Cryptosystems and their Components, Design Objectives and ...

ABCMint is a quantum resistant cryptocurrency with the Rainbow Multivariable Polynomial Signature Scheme.

Good day, the price is going up to 0.3 USDT.

ABCMint Second Foundation

The ABCMint Second Foundation has been the first third-party organization focused on post-quantum cryptography research and technology; since 2018 it has aimed to help improve the ecosystem around ABCMint technology.


https://abcmintsf.com

https://abcmintsf.com/exchange


What is ABCMint?

ABCMint is a quantum resistant cryptocurrency with the Rainbow Multivariable Polynomial Signature Scheme.

Cryptocurrencies and blockchain technology have attracted a significant amount of attention since 2009. While some cryptocurrencies, including Bitcoin, are used extensively around the world, these cryptocurrencies will eventually become obsolete and be replaced once quantum computers become available. For instance, Bitcoin uses the elliptic curve signature algorithm ECDSA. If a Bitcoin user's public key is exposed on the public chain, a quantum computer would be able to reverse-engineer the private key in a short period of time. This means that an attacker who uses a quantum computer to break ECDSA would be able to spend the bitcoin in the wallet.

The ABCMint Foundation has improved the structure of the coin's core to resist quantum computers, using the Rainbow Multivariable Polynomial Signature Scheme, which is quantum resistant, as the core. This is a fundamental solution to the major threat that future quantum computers pose to digital money. In addition, the ABCMint Foundation has implemented a new form of arithmetic proof of work (mining), "ABCardO", which is different from Bitcoin's arbitrary mining. This algorithm is believed to be beneficial to the development of the mathematical field of multivariate polynomials.


Rainbow Signature - a quantum-resistant signature based on the Multivariable Polynomial Signature Scheme

Unbalanced Oil and Vinegar (UOV) is one of the oldest and most thoroughly researched signature schemes in the field of multivariate cryptography. It was designed by J. Patarin in 1997 and has withstood more than two decades of cryptanalysis. The UOV scheme is very simple, and its signatures are small and fast to compute. However, the main drawback of UOV is the large public key, which is not conducive to use in blockchain applications.

The Rainbow signature is an improvement on the oil and vinegar signature that increases the efficiency of unbalanced oil and vinegar. The basic concept is a multi-layered structure and a generalization of oil and vinegar.
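To make the idea of a multivariate signature concrete, here is a toy sketch in Python. It is not Rainbow or UOV itself and is far too small to be secure; the field size, variable counts and brute-force "signing" step are illustrative assumptions. The point it shows is the shape of such schemes: the public key is a set of quadratic polynomials over a finite field, and verification is just evaluating them on the signature and comparing with the message digest.

```python
# Toy illustration only: NOT Rainbow or UOV, and far too small to be secure.
# It shows the shape of a multivariate-quadratic (MQ) signature: the public
# key is a set of random quadratic polynomials over a small finite field,
# and verification just evaluates them on the signature.
import hashlib
import itertools
import random

Q = 7        # tiny field GF(7); real schemes use larger fields and many variables
N_VARS = 3   # signature variables
N_EQS = 2    # public polynomials

def random_poly():
    """Random quadratic polynomial: x_i*x_j terms, linear terms, constant."""
    quad = {(i, j): random.randrange(Q) for i in range(N_VARS) for j in range(i, N_VARS)}
    lin = [random.randrange(Q) for _ in range(N_VARS)]
    return quad, lin, random.randrange(Q)

def evaluate(poly, x):
    quad, lin, const = poly
    total = const + sum(c * x[i] * x[j] for (i, j), c in quad.items())
    total += sum(c * x[i] for i, c in enumerate(lin))
    return total % Q

def digest(message):
    """Hash the message into one field element per public polynomial."""
    h = hashlib.sha256(message).digest()
    return [h[k] % Q for k in range(N_EQS)]

message = b"toy MQ signature demo"
target = digest(message)

# "Sign" by brute force over the tiny field. Real UOV/Rainbow signers instead
# use the secret oil-and-vinegar structure as a trapdoor to solve the system.
while True:
    public_key = [random_poly() for _ in range(N_EQS)]
    solutions = [x for x in itertools.product(range(Q), repeat=N_VARS)
                 if [evaluate(p, x) for p in public_key] == target]
    if solutions:            # a random toy system occasionally has no preimage
        signature = solutions[0]
        break

# Verification: evaluate the public polynomials and compare with the digest.
assert [evaluate(p, signature) for p in public_key] == digest(message)
print("signature", signature, "verifies for digest", target)
```

In real UOV and Rainbow, the trapdoor structure lets the signer solve such a system with linear algebra instead of search, while an attacker faces a hard multivariate quadratic problem.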


PQC - Post Quantum Cryptography

The public-key cryptosystem was a breakthrough in modern cryptography in the late 1970s and has become an increasingly important part of our cryptographic communications. The Internet and other communication systems rely heavily on Diffie-Hellman key exchange, RSA encryption, and the DSA, ECDSA or related algorithms for digital signatures. The security of these cryptosystems depends on the difficulty of number-theoretic problems such as integer factorization and the discrete logarithm problem. In 1994, Peter Shor demonstrated that quantum computers can solve all of these problems in polynomial time, which undermines the security of the cryptosystems built on them. The effort to build cryptosystems that remain secure against quantum computers is known as "post-quantum cryptography" (PQC).

In August 2015, the U.S. National Security Agency (NSA) released an announcement regarding its plans to transition to quantum-resistant algorithms. In December 2016, the National Institute of Standards and Technology (NIST) announced a call for proposals for quantum-resistant algorithms. The deadline was November 30, 2017, and the submissions included the Rainbow signature scheme used by ABCMint.
submitted by WrapBeautiful to ABCMint

Technical: Upcoming Improvements to Lightning Network

Price? Who gives a shit about price when Lightning Network development is a lot more interesting?????
One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On the one hand, it lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand, it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up.
The below is just a smattering sample of LN stuff I personally find interesting. There's a bunch of other stuff, like splicing and dual-funding, that I won't cover --- the post is long enough as-is, and besides, some of the items below aren't as well-known.
Anyway.....

"eltoo" Decker-Russell-Osuntokun

Yeah the exciting new Lightning Network channel update protocol!

Advantages

Myths

Disadvantages

Multipart payments / AMP

Splitting up large payments into smaller parts!

Details

Advantages

Disadvantages

Payment points / scalars

Using the magic of elliptic curve homomorphism for fun and Lightning Network profits!
Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when inputted to SHA256, returns the given payment hash.
Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point.
This is basically Scriptless Script usage on Lightning: instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
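A minimal sketch of the swap, assuming the third-party `ecdsa` Python package for secp256k1 arithmetic (the variable names are illustrative): the invoice commitment changes from SHA256(preimage) to payment_scalar * G.

```python
# Sketch only; assumes the third-party "ecdsa" package (pip install ecdsa)
# for secp256k1 arithmetic. Variable names are illustrative.
import hashlib
import secrets

from ecdsa import SECP256k1

G = SECP256k1.generator   # standard generator point
N = SECP256k1.order       # group order

# HTLC today: the invoice commits to SHA256(preimage); revealing the
# preimage settles the payment.
preimage = secrets.token_bytes(32)
payment_hash = hashlib.sha256(preimage).hexdigest()

# PTLC proposal: the invoice commits to payment_point = payment_scalar * G;
# revealing the scalar (a private key) settles the payment.
payment_scalar = secrets.randbelow(N - 1) + 1
payment_point = G * payment_scalar

print("payment hash :", payment_hash)
print("payment point:", hex(payment_point.x()), hex(payment_point.y()))
```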

Advantages

Disadvantages

Pay-for-data

Ensuring that payers cannot access data or other digital goods without proof of having paid the provider.
In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
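A rough sketch of the idea, assuming the third-party `cryptography` package for the symmetric cipher; the exact encryption scheme here is an illustrative choice, and the essential point is only that the payment hash commits to the decryption key.

```python
# Sketch only; assumes the third-party "cryptography" package for the
# symmetric cipher. The essential point: the payment hash commits to the
# decryption key, so paying the invoice reveals the key.
import hashlib

from cryptography.fernet import Fernet

# Provider: encrypt the goods under a fresh key and invoice against
# SHA256(key), so the payment preimage *is* the decryption key.
decryption_key = Fernet.generate_key()               # doubles as the payment preimage
ciphertext = Fernet(decryption_key).encrypt(b"the digital goods")
payment_hash = hashlib.sha256(decryption_key).hexdigest()
# -> hand the buyer (ciphertext, invoice carrying payment_hash)

# Buyer: completing the Lightning payment reveals the preimage, which both
# proves payment and decrypts the data.
revealed_preimage = decryption_key                   # learned by paying the invoice
assert hashlib.sha256(revealed_preimage).hexdigest() == payment_hash
print(Fernet(revealed_preimage).decrypt(ciphertext))
```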

Advantages

Disadvantages

Stuckless payments

No more payments getting stuck somewhere in the Lightning network without knowing whether the payee will ever get paid!
(that's actually overstating it a bit: payments can still get stuck, but what "stuckless" really enables is that we can now safely run another payment attempt in parallel until any one of the attempts gets through).
Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
  1. The payment scalar corresponding to the payment point in the invoice signed by the payee.
  2. An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates different acknowledgment scalars for each payment attempt.
If at least one of the payment attempts reaches the payee, the payee can acquire the acknowledgment scalar from the payer and then claim the payment. If the payee tries to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and tells the payee "LOL don't try to scam me", so the payee can only acquire a single acknowledgment scalar, meaning it can only claim the payment once; it can't claim multiple parallel attempts.
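A small sketch of the underlying point arithmetic, again assuming the third-party `ecdsa` package; names like `ack_scalar` are illustrative. It shows why the payee needs both secrets: the attempt is locked to the sum of the payment point and the acknowledgment point.

```python
# Sketch only; assumes the third-party "ecdsa" package for secp256k1.
# Names like ack_scalar are illustrative.
import secrets

from ecdsa import SECP256k1

G = SECP256k1.generator
N = SECP256k1.order

payment_scalar = secrets.randbelow(N - 1) + 1   # known to the payee (its invoice secret)
ack_scalar = secrets.randbelow(N - 1) + 1       # fresh per attempt, held by the payer

# The attempt is locked to the sum of the two points, so claiming it
# requires knowing both scalars.
lock_point = G * ((payment_scalar + ack_scalar) % N)

# Homomorphism: (a + b) * G == a * G + b * G, so the lock point can also be
# built from the two public points without knowing either secret.
combined = (G * payment_scalar) + (G * ack_scalar)
assert (lock_point.x(), lock_point.y()) == (combined.x(), combined.y())

# Only after the payer hands over ack_scalar for this attempt can the payee
# form the claiming secret.
claim_secret = (payment_scalar + ack_scalar) % N
assert (G * claim_secret).x() == lock_point.x()
```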

Advantages

Disadvantages

Non-custodial escrow over Lightning

The "acknowledgment" scalar used in stuckless can be reused here.
The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes.
If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
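A hedged sketch of one way such an acknowledgment scalar could be derived; the exact derivation here is an assumption, not a specification. The idea is an ECDH shared secret between payer and escrow, hashed together with the contract terms, so both parties can compute it independently.

```python
# Hedged sketch: one way the acknowledgment scalar could be derived (the
# exact derivation is an assumption, not a specification). Assumes the
# third-party "ecdsa" package for secp256k1.
import hashlib
import secrets

from ecdsa import SECP256k1

G = SECP256k1.generator
N = SECP256k1.order

payer_priv = secrets.randbelow(N - 1) + 1
escrow_priv = secrets.randbelow(N - 1) + 1
payer_pub, escrow_pub = G * payer_priv, G * escrow_priv

contract_hash = hashlib.sha256(b"ship goods X in exchange for the payment").digest()

def ack_scalar(own_priv, other_pub):
    """ECDH shared point hashed together with the contract terms."""
    shared = other_pub * own_priv
    data = shared.x().to_bytes(32, "big") + contract_hash
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % N

# Payer and escrow derive the same scalar independently, so the escrow can
# release it to the payee if the payer stalls after receiving the goods.
assert ack_scalar(payer_priv, escrow_pub) == ack_scalar(escrow_priv, payer_pub)
ack_point = G * ack_scalar(payer_priv, escrow_pub)   # what the payee checks with the escrow
```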

Advantages

Disadvantages

Payment decorrelation

Because elliptic curve points can be added (unlike hashes), we can add a "blinding" point / scalar for every forwarding node. This prevents multiple forwarding nodes from discovering that they have been on the same payment route. This is unlike the current payment hash + preimage scheme, where the same hash is used along the entire route.
In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
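A short sketch of the decorrelation idea, assuming the third-party `ecdsa` package; the hop count and names are illustrative. Each hop tweaks the lock point with its own blinding scalar, so no two hops see the same point, yet the payee can still settle with the payment scalar plus the sum of the blinding scalars.

```python
# Sketch only; assumes the third-party "ecdsa" package. The hop count and
# names are illustrative.
import secrets

from ecdsa import SECP256k1

G = SECP256k1.generator
N = SECP256k1.order

payment_scalar = secrets.randbelow(N - 1) + 1
payment_point = G * payment_scalar

blinding = [secrets.randbelow(N - 1) + 1 for _ in range(3)]   # one scalar per hop

# Each hop tweaks the lock point, so no two hops see the same point
# (unlike today's scheme, where the same payment hash appears at every hop).
point_at_hop, seen = payment_point, []
for b in blinding:
    point_at_hop = point_at_hop + (G * b)
    seen.append((point_at_hop.x(), point_at_hop.y()))
assert len(set(seen)) == len(seen)

# The payee can still settle: the final secret is the payment scalar plus
# the sum of the blinding scalars (which can double as the ack scalar).
total_secret = (payment_scalar + sum(blinding)) % N
assert (G * total_secret).x() == point_at_hop.x()
```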

Advantages

Disadvantages

submitted by almkglor to Bitcoin

Quantum Computing Vs. Blockchain

Quantum Computing Vs. Blockchain


The cryptocurrency community has long been discussing one technical feature of the blockchain which directly affects its future: the threat to the blockchain from so-called quantum computing. If these threats materialize, crypto assets will no longer be able to function technically, and all the problems with their regulation will disappear by themselves.
Indeed, what is the point of creating a serious regulatory system for an instrument that will soon become simply inoperable?
Most modern cryptocurrencies are built on a particular cryptographic algorithm that ensures their security. The level of protection is determined by the amount of work required to find the key, the secret that determines the final result of the cryptographic conversion. When solving such cryptographic problems, a classical computer tests possible keys exhaustively, one after another. A quantum computer can test a set of keys simultaneously and establish the combination that has the maximum probability of being true, thereby compromising the cryptosystem.
The threat to Bitcoin is that high-speed quantum computers will eventually be able to "create problems" for the encryption processes and digital signatures used in blockchain technology and virtual currencies. Ultra-fast calculations would in principle allow attackers to forge smart contracts and steal "coins".
Most cryptocurrencies use public-key encryption algorithms for communications and, in particular, for digital signatures. Public-key cryptography is based on one-way mathematical functions: operations that are simple in one direction and difficult in the other. If quantum computers rather than classical ones are used to solve the factorization problem, it is solved much faster. A quantum computer could determine the secret key from the public key within a couple of minutes, and knowledge of the secret key gives access to the corresponding address on the Bitcoin network. It turns out that the owner of a quantum computer would be able to break the public-key encryption system and write off (steal) "coins" from the corresponding address. This feature of quantum computing is the main danger for Bitcoin.
According to some estimates, a quantum computer will be able to derive the secret key from the public key by 2027.
Some commentators believe that with the advent of full-fledged quantum computers, the era of cryptocurrencies and blockchain will come to its logical end: the cryptographic systems on which cryptocurrencies are based will be compromised, and the cryptocurrencies themselves will become worthless. Allegedly, the first thing the owner of a quantum computer will do is quickly mine the remaining bitcoins, ethers and other popular crypto-coins. Experts have estimated that breaking Bitcoin would require a quantum computer with a capacity of 10 thousand qubits, and the wait for one may not be long: perhaps ten years, or even less.
IBM 50Q System: an IBM cryostat wired for a 50-qubit system. Photo from www.ibm.com.
However, not everyone shares this opinion.
According to new forecasts, a commercially acceptable version of the quantum computer will not appear until 2040. Many cryptocurrency experts are sure that by this moment developers will be able to prepare and adapt the blockchain to new realities. They will be able to modify the cryptocurrency code and protect the technologies used in it from hacking.
Analysts, however, emphasize that although an attacker with a powerful quantum computer will be able to get the secret key from the public, it is impossible to get the public key from the bitcoin address of the recipient of the transaction. The public key is converted to a bitcoin address by several unidirectional hash functions that are resistant to quantum computation. However, in fact, the public key still gets into the network one day. This occurs when the transaction is signed by the sender of the “coin”. Otherwise, the network will not be able to confirm the transaction, because there is no other way to verify the authenticity of the sender’s signature.
The widespread fear of a direct threat to Bitcoin from quantum computing is exaggerated and comes from ignorance. In fact, by using crowdsourcing, blockchain technology solves many problems, including reducing threats to its security from quantum computers. That is why a blockchain-based network offers better protection than networks and platforms with a centralized architecture. Dr. Brennan has analyzed the threat posed to blockchain technologies by modern quantum computing systems. He investigated the potential of a quantum computer in terms of its possible use "for manipulating the blockchain in the centralization of hashing power" and assessed the probability of recovering the key of the encryption system that underlies the mechanism protecting users of the blockchain. The results of the study show that existing developments in the field of quantum computing are very far from the "imaginary possibilities" of quantum technologies: the modern quantum infrastructure is characterized by speeds that are absolutely insufficient to solve extremely complex problems such as finding an encryption key in an acceptable time.
At least over the horizon of the next 10 years, the speed of quantum computers will remain insufficient compared to the capabilities of modern mining machines.

Bitcoin will not give way before quantum computing.

Can Quantum Computing Take Over Blockchain?

Practice crosses out any theoretical constructions claiming that quantum computing is able to "master" the blockchain. This is due to the limited capabilities of existing technical means and the ongoing development of the blockchain's protection systems. Any technology that could compromise the work of the blockchain is becoming obsolete by the time it appears; it is constantly about ten years behind the development of blockchain technology.
John Martinis, the head of Google's quantum computing laboratory, has also rejected the assumption that quantum computing could pose a direct threat to blockchain systems and cryptocurrencies in the near future. Martinis believes that the process of creating quantum computers will take at least a decade, and that the practical implementation of effective quantum computing will require even more time. He believes that the creation of quantum devices "is really problematic and much more difficult than the creation of a classical computer".
Andreas Antonopoulos, one of the world's leading experts on Bitcoin and blockchain, has looked at the problem from another angle. (Image: Andreas Antonopoulos' official Twitter page.)
He is convinced that the US NSA and other intelligence agencies will not use a quantum computer against bitcoin, even if they have such weapons.
Andreas Antonopoulos said:
“I’m not at all worried that the NSA might have a quantum computer, because the basic security law says: if you have a powerful secret weapon, you do not use it. You need a very significant excuse to use it”.
He cited as an example the breaking of the German "Enigma" military cipher machine by the British cryptographer Alan Turing during the Second World War. The Germans used this machine, in particular, for secret communication in the Navy. The British government then decided to keep this success in the strictest confidence and to hide the source of the information by any means (it was removed from the communication channels). The British even deliberately refrained from preventing the sinking of some of their own ships, because as soon as the enemy realized its codes had been compromised, it would immediately take measures to improve its technology.
The question of the threat of quantum computing is not about the existence of a quantum computer, but about its power, that is, the number of quantum bits (qubits). At this stage of development, intelligence agencies cannot have enough power to attack the Bitcoin blockchain. A really serious problem will arise when quantum computers become commercially available, but not yet so widespread that everyone can rely on them for their own bitcoin wallet. During this transition period, Bitcoin will need to switch to new algorithms. It is not yet clear how this transition will take place.
Researchers are also assessing the feasibility of a quantum-secured blockchain, the essence of which is to make quantum communication the central element of the blockchain's protection. Quantum communications (or, more precisely, quantum key distribution) guarantee security based on the laws of physics rather than on the difficulty of solving mathematical problems, as in the case of public-key cryptography. As a result, a quantum blockchain (which can be defined as a set of methods for using quantum technologies for computation, whose operation relies on quantum communications to authenticate the participants of transactions) would be invulnerable to attacks using a quantum computer.
Brennen and Tucker agree that quantum computing, at least on paper, definitely poses a threat to the security of blockchain networks. Such fears are fed by sensationalist, panic-inducing articles in the media. Tucker believes that talk of quantum computing posing an immediate threat to the blockchain distracts from the really important topics for discussion. The quantum threat to Bitcoin cannot be completely excluded, but the level of this threat is estimated as minimal, especially taking into account the high reliability of this cryptocurrency's network and the powerful incentives to ensure the highest level of its security.
Perhaps two conclusions can be drawn from all this. First, Bitcoin in its current form is genuinely vulnerable to quantum computing. Second, it is equally obvious that there are, and will be, many opportunities to improve it in the future: on the one hand, alternative systems of cryptographic protection for transactions, including ones based on public-key ciphers; on the other, quantum communication systems that guarantee the security of communication without relying on mathematics.
So quantum systems promise new means of protecting virtual currency blockchains. If we turn to ordinary money, we can note that its means of protection evolve constantly alongside technology: new and unusual technologies are continually being devised to protect conventional paper money against counterfeiting. From all this, it follows that from a technical point of view crypto assets are here to stay for a long time, which makes their regulation useful.
Material developed by the Legal Department of EdJoWa Holding
submitted by IMBA-Exchange to u/IMBA-Exchange

LTO Network - Hybrid blockchain built for business

INTRODUCTION
We're celebrating the 10th year of blockchain technology, and a lot of experience has accumulated during that time. The market is slowly starting to correct itself, and one of the biggest reasons for this is growing confidence in the market. The market's gradual elimination of fraud and security holes has made blockchain technology more powerful every day. It is fair to say that there are still some difficulties, because it is a very new technology. But if you think like me, you accept that blockchain technology is a weapon capable of changing our lives. The blockchain is already used in banking, artificial intelligence, entertainment, municipalities, shopping and much more. It is slowly gaining the power to do everything that the fiat money we know can do. The blockchain has not only made money easier to use, it has also allowed us to use it in far more functional ways. In addition, it seems to do a much better job of distributing income than fiat money does. But the blockchain must keep moving forward. This amazing technology, which changes the way we look at money and banks, is here for more.
Problem Analysis
Nowadays, many altcoins use a limited number of blockchain infrastructures. The most common and best known is the Ethereum infrastructure; about 200 altcoins use it. In fact, Binance, the world's largest exchange, uses this infrastructure for its own coin, BNB. Before turning to Ethereum, we should talk about Bitcoin's infrastructure, which according to Coinmarketcap data dominates 52% of the market. Compared to fiat money, Bitcoin's transaction speed and commission fees are really incredible: Bitcoin transactions are typically settled within about an hour of entering the confirmation process, which is much faster than international fiat money transfers and comes with lower commissions. But ten years after Bitcoin's invention, we can say that much more advanced technology is now available. For example, the Ethereum infrastructure can perform these operations in minutes, and projects such as EOS, which is also a blockchain, aim to shorten these times further. But it is a fact known by everyone that these infrastructures have problems with security, speed and integration. Because of these vulnerabilities, cryptocurrency worth millions of dollars becomes the target of attacks every year. This kind of situation damages blockchain technology and continues to be a serious obstacle to the blockchain revolution. However, new projects, or rather projects that use a solid infrastructure, do not seem likely to encounter these problems in the future.
What is the LTO network ?
LTO Network is a decentralized and highly efficient blockchain infrastructure that provides maximum efficiency to its users and enables the integration of blockchain infrastructure into existing, production-ready systems. The LTO Network project is a very advanced technology product backed by 10 years of blockchain experience. I remember when the project was chosen as the Most Valuable ICO of the year during its ICO in 2018. Since then, the project has proven itself by providing services to various customers from around the world.
Business Process Modeling is a common strategy that small, medium and large enterprises use to maintain their business continuity. Creating a visual model of a workflow process is a step towards ensuring that it can be analyzed, improved and automated. Unlike procedures written in a natural language or in a programming language, these models can be understood both by people and by computers.
For inter-organizational collaboration, however, such modelling has not been available to enhance communication. The parties concerned must specify the process to be used as a binding agreement; on the LTO platform this is called a Live Contract. The LTO platform creates a temporary blockchain for each Live Contract. This type of blockchain is not designed as a literal ledger.
Who are the actors on the LTO Network?
The LTO Network system distinguishes 4 different classes of token holders.

Running node;

LTO Network Award pool and operating system
We mentioned above that you have a chance to earn passive income through LTO Network. The rewards vary according to the tokens that customers hold; a number of rewards are available for keeping these tokens in a wallet, and token holders are rewarded according to the ratios shown below.

  1. If a user has 10% of the total number of tokens on the network and contributes 10% of the total transactions, this token holder's block validation rate will be 105%.
  2. If a user has 10% of the total token supply but does not contribute to any authentication transition based on network operations, the block validation rate will be below 5%.
The LTO Network stake pool is calculated by the following formula: the number of tokens staked divided by the contribution to block approvals. This ratio determines your share of the staking rewards.
LTO Blockchain and ERC-20 Wallet
LTO Network has two different tokens, each used for a different purpose. These two tokens can be exchanged with each other through the "bridge troll". The exchange rate is 1:1.
There is a system called the "bridge" between the mainnet token pool and the ERC-20 token pool, and the two pools serve different purposes:

4 Main Dynamics of LTO Network
LTO Network has shaped the project around 4 main dynamics: Development, Growth, Shake-out and Maturity.
In the first of these stages, Development, tokens are distributed; those who buy the token early are rewarded as the project moves towards the "net zero" point mentioned in the chart.
At the Growth stage, the share of passive customers will increase as more of them adapt to the platform, and the speed of transactions will increase as the project approaches the "net zero" point.
In the Shake-out stage, the "net zero" incentives will increase for the founders, and entry to the prize pool will become somewhat narrower as the stakes increase.
In the Maturity stage, the market will grow as the majority of customers become partners, while passive holders will be rewarded by the system as the staking rewards expand much more widely.
CONCLUSION
Although LTO Network seems very complex, it is much easier to understand when you look at it closely. Since it has a much more advanced structure than other blockchain technologies, there are many opportunities to be gained from it. The token is already listed on Coinmarketcap and is traded on many exchanges. Anyone who wants to invest in this project and runs a node registered in the system can earn passive income. In addition, if you already have a system of your own, you can work with LTO Network and get the chance to improve the quality of your business.
TOKEN ALLOCATION
MEET THE TEAM
For more information:
WEBSITE: https://lto.network
TELEGRAM: https://t.me/joinchat/AJWQTUDKtDlsuGHVFb40eQ
WHITEPAPER: https://lto.network/documents/LTO%20Network%20-%20Technical%20Paper.pdf
TWITTER: https://twitter.com/ltonetwork
MEDIUM: https://medium.com/ltonetwork
REDDIT: https://www.reddit.com/livecontracts/
YOUTUBE: https://www.youtube.com/channel/UCaHcF-xterKYTKSpY4xgKiw
GITHUB: https://github.com/legalthings
LTO Wallet Address: 3Jkjdy2bViwGsML6emPDAGBF5XC7MwYCq4g
MEW: 0x30Da745c024923B55f3a73E530e18382eF2130eB
Telegram: @nuxxorcoin
submitted by aerios01 to LTONetwork

Did you know that LISK uses Schnorr signature-based Ed25519 scheme which is more secure, much faster, more scalable than secp256k1 which is used by Bitcoin, Ethereum, Stratis

Schnorr signatures have been praised by Bitcoin developers for a while. Adam Back admitted they were more secure:
https://bitcointalk.org/index.php?topic=511074.msg5727641#msg5727641
And they are much faster (scalable to verifying hundreds of thousands of transactions per second):
https://bitcointalk.org/index.php?topic=103172.0
DJB and friends claim that with their ed25519 curve (the "ed" is for Edwards) and careful implementation they can do batch verification of 70,000+ signatures per second on a cheap quad-core Intel Westmere chip, which is now several generations old. Given advances in CPUs over time, it seems likely that in the very near future the cited software will be capable of verifying many hundreds of thousands of signatures per second even if you keep the core count constant. But core counts are not constant - it seems likely that in 10 years or so 24-32 core chips will be standard even on consumer desktops. At that point a million signatures per second or more doesn't sound unreasonable.
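A quick, unscientific way to get a feel for single-core Ed25519 verification speed, assuming the third-party `cryptography` package is installed; the numbers you get depend entirely on your machine and library and are not comparable to the batch-verification figures quoted above.

```python
# Unscientific single-core benchmark; assumes the third-party "cryptography"
# package. Results depend entirely on your machine and library, and are not
# comparable to the batch-verification figures quoted above.
import time

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

COUNT = 2000
key = Ed25519PrivateKey.generate()
pub = key.public_key()
msgs = [f"tx-{i}".encode() for i in range(COUNT)]
sigs = [key.sign(m) for m in msgs]

start = time.perf_counter()
for sig, msg in zip(sigs, msgs):
    pub.verify(sig, msg)          # raises InvalidSignature on failure
elapsed = time.perf_counter() - start
print(f"{COUNT / elapsed:.0f} Ed25519 verifications/second (no batching)")
```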
Gavin Andresen, the former Bitcoin Chief Scientist, wanted to support it in Bitcoin:
https://www.reddit.com/Bitcoin/comments/2jw5pm/im_gavin_andresen_chief_scientist_at_the_bitcoin/clfp3xj/
Bitcoin developers have discussed including it: https://github.com/bitcoin-core/secp256k1/pull/212
However, it is still on the wishlist: https://en.bitcoin.it/wiki/Softfork_wishlist
Ed25519 is used in Tahoe-LAFS, one of the most respected crypto projects: https://moderncrypto.org/mail-archive/curves/2014/000069.html
LISK is IoT friendly
A nice feature of Schnorr signatures is that by design they do not require a lot of computation on the signer side. Therefore, you can use them even on a computationally weak platform (think of a smart card or RFID), or on a platform with no hardware support for multiple-precision arithmetic.
Advantages of Schnorr signatures
According to David Harding, Schnorr signatures can bring many benefits
Smaller multisig transactions
Slightly smaller for all transactions
Plausible deniability for multisig
Plausible deniability of authorized parties: using a third-party organizer (which doesn't need to be trusted with private keys), it's possible to prevent signers from knowing whether their private key is part of the set of signing keys.
Theoretically better security properties: the ed25519 page linked above also describes several ways it is resistant to side-channel attacks, which can allow hardware wallets to operate safely in less secure environments.
Faster signature verification: it likely takes fewer CPU cycles to verify an ed25519 Schnorr signature than a secp256k1 ECDSA signature.
Multi-crypto multisig: with two (slightly) different cryptosystems to choose from, high-security users can create 2-of-2 multisig pubkey scripts that require both ECDSA and Schnorr signatures, so their bitcoins can't be stolen if only one cryptosystem is broken.
https://bitcoin.stackexchange.com/questions/34288/what-are-the-implications-of-schnorr-signatures
Scalable multisig transactions
The magic of Schnorr signatures is most evident in their ability to aggregate signatures from multiple inputs into a single signature that is validated for each individual transaction (a minimal sketch of this aggregation follows below). The scaling implications of this are obvious: aggregation allows for non-trivial savings in terms of transmission, validation and storage for every peer on the network. The chart below illustrates the historical impact a switch to Schnorr signatures would have had in terms of space savings on the blockchain. (Alex B.)
Infamous malleability is a non-issue in LISK
There is provably no inherent signature malleability, while ECDSA has a known malleability and lacks a proof that no other forms exist. Note that Segregated Witness already makes signature malleability not result in transaction malleability, however. https://www.elementsproject.org/elements/schnorr-signatures/
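Here is the aggregation sketch referred to above: a naive 2-of-2 Schnorr aggregation over secp256k1, assuming the third-party `ecdsa` package. Production designs such as MuSig add key-prefixed hashing to block rogue-key attacks; this only illustrates why a single (R, s) pair verifies against the sum of the public keys.

```python
# Naive 2-of-2 Schnorr aggregation over secp256k1; assumes the third-party
# "ecdsa" package. Production designs (e.g. MuSig) add key-prefixed hashing
# to block rogue-key attacks; this only shows why one (R, s) pair verifies
# against the sum of the public keys.
import hashlib
import secrets

from ecdsa import SECP256k1

G = SECP256k1.generator
N = SECP256k1.order

def challenge(*parts):
    return int.from_bytes(hashlib.sha256(b"".join(parts)).digest(), "big") % N

def xb(point):
    return point.x().to_bytes(32, "big")

msg = b"aggregate me"

# Two signers with independent keys and nonces.
x1, x2 = (secrets.randbelow(N - 1) + 1 for _ in range(2))
r1, r2 = (secrets.randbelow(N - 1) + 1 for _ in range(2))
P = (G * x1) + (G * x2)                 # aggregate public key
R = (G * r1) + (G * r2)                 # aggregate nonce

e = challenge(xb(R), xb(P), msg)        # shared challenge
s = (r1 + e * x1 + r2 + e * x2) % N     # sum of partial signatures s_i = r_i + e*x_i

# Verifier only sees (R, s) and the aggregate key P, and checks s*G == R + e*P.
lhs, rhs = G * s, R + (P * e)
assert (lhs.x(), lhs.y()) == (rhs.x(), rhs.y())
print("aggregate Schnorr signature verifies")
```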
Bitcoin has malleability bugs
submitted by Corinne1992 to CryptoCurrency

Why having Shadowcash, Dash, and Zcash in the cryptocurrencies section is incredibly irresponsible.

Cryptocurrencies, as many of you know, allow people, for the first time, to be in control of their money. This is important for a multitude of reasons related to personal freedoms and expression, but not all cryptocurrencies can be trusted so blindly. Bitcoin is obviously the most famous and is well received among freedom-seeking individuals, but Bitcoin is not fungible and all transactions are publicly viewable, similar to publishing your bank statements online, which is not very good for privacy.
Now, the reason why I think the cryptocurrencies mentioned in the title are very poor candidates to be published on https://www.privacytools.io is that they make bold claims about security and privacy, two things many of us here value very much, yet offer no assurances for those claims, which users of privacytools.io would be relying on and encouraged to trust.
Shadowcash was created as a fork of the Bitcoin codebase with ring signatures implemented poorly as an optional sending feature meant to obscure transactions. One of Monero's cryptographers, Shen Noether, reviewed Shadowcash's ring-signature implementation and found that it was completely and irrevocably broken. Further reading on this topic can be found here: https://shnoe.wordpress.com/2016/02/11/de-anonymizing-shadowcash-and-oz-coin/ and here: https://github.com/ShenNoetheDeanon. Developers with such a lack of understanding of cryptography and security, who are creating applications regular people would be relying on for their personal privacy, should not be candidates for an esteemed website like https://www.privacytools.io. Not to mention a shady release in which over 6 million coins were mined in the first two weeks of the network's lifetime, followed by a quick pivot to a Proof-of-Stake network that no longer mints new coins. This event very quickly and unfairly distributed large portions of the currency to the developers of Shadowcash, calling into question the legitimacy of this cryptocurrency and whether privacy and security are the goal of the project at all.
Dash (formerly known as Darkcoin) was also a fork of Bitcoin; to distinguish itself it implemented a modified version of CoinJoin for optional use in the protocol, which, as history has shown with Bitcoin's version of CoinJoin, makes zero assurances of privacy, because the blockchain and all transactions are still publicly auditable by following the previous outputs. In the Dash network there are centralized nodes called "masternodes" which execute many operations in the network, from CoinJoin to the locking of transactions for faster "confirmations". Dash is touted as a private cryptocurrency yet has even less privacy than Bitcoin: http://weuse.cash/2016/10/26/warning-dash-privacy-is-worse-than-bitcoin/ The release of Darkcoin (later rebranded to Dash), as described in the previous link, involved mining over 2 million coins (roughly 30% of the coin supply) over the course of two days. This is similar to Shadowcash's launch, which calls into question the legitimacy of such a "privacy-focused" project and should warrant zero trust in the claims it makes about privacy or security for users of privacytools.io.
On to Zcash, another cryptocurrency that forked Bitcoin's code with the goal of creating a private and secure cryptocurrency, using a very new cryptosystem called zk-SNARKs. I won't go into much detail about how zk-SNARKs work, because this topic alone goes far beyond the scope of this post. Essentially, Zcash's optional feature for sending private transactions breaks the link between sender and recipient to obscure transaction data on the blockchain. Now, zk-SNARKs are an incredibly experimental cryptosystem that has not been thoroughly peer-reviewed by the academic community. This fact alone should warrant caution about relying on Zcash for privacy: cryptography barely three years old, which has not stood the test of time needed to be considered secure and reliable, is a very poor thing to rely on for the privacy and security of privacytools.io users. Not to mention the trusted setup performed by two employees of the Zcash Electric Coin Company and four other individuals to generate the zk-SNARK parameters for the launch of Zcash. The blind trust that Zcash requires would allow infinite coins to be invisibly created in the network if these six individuals colluded or were using compromised computers for the generation. To end this section: the corporation, Zcash Electric Coin Company, will receive 20% of all block rewards for the next four years. Such corporate interest and unproven cryptographic algorithms cannot be trusted by individuals who are seeking privacy and security on privacytools.io.
To reiterate on these three cryptocurrencies: snake-oil cryptography that has not been verified as secure by academic researchers, together with false claims about privacy and security, should not be published on a website like privacytools.io that encourages sound, secure, privacy-oriented applications to be used over traditional options.
Now, the argument for why Monero is a strong candidate to be featured on privacytools.io alongside Bitcoin is that Monero does not make false claims about security and privacy. Monero has been thoroughly reviewed by academic researchers (the CryptoNote review, a Ledger journal publication, researchers from University College London, and more). Monero uses widely acclaimed and reviewed cryptography created by Daniel J. Bernstein, an outspoken cryptographer and a huge supporter of privacy-related projects. Monero is, unlike all other cryptocurrencies, private by default, allowing its users to freely transact on its network without adversaries surveilling all past and present transactions. Monero is, in addition, a thriving open-source project that is also creating an alternative security-focused I2P router. The goals of Monero are very much in line with what privacy-seeking individuals desire. We simply want to reclaim our privacy in a world where privacy is relentlessly attacked and criticized. I hope the administrators of https://www.privacytools.io can re-evaluate the cryptocurrencies featured in the cryptocurrency section, because I cannot emphasize enough how detrimental such recommendations can be when they point to a cryptocurrency that may greatly threaten users' privacy.
Thank you.
submitted by yuvzst to privacytoolsIO

How Secure Are Blockchain-Based Transactions?

How Secure Are Blockchain-Based Transactions?
Blockchain, the distributed ledger technology that underpins Bitcoin, is the wave of the financial future.
By transforming industries with advanced and optimized architecture, this ingenious technology eliminates intermediaries in many essential services, reducing costs and increasing efficiency.
However, blockchain technology and its applications are still in their early years, so different concerns do exist.

https://smartcryptosolution.org/
Businesses around the world are exploring innovative ways to harness the disruptive power of blockchain technology to securely exchange stocks and assets.
Several government authorities, institutions, and businesses have tested the blockchain development technology and found it incredibly safe and immutable.
Is blockchain technology powerful enough to stop cybercriminals, improve security and transparency in all transactions?
  • Decentralization, cryptography and consensus are the three defining characteristics of blockchain technology contributing to its immutability.
  • When you own bitcoin, you use digital keys (a public and private key pair) to confirm your ownership of and access to funds in a secure cryptosystem.
  • Whenever there is a digital asset transaction, its details are stored in a block that includes the data, the hash and the hash of the previous block.
  • The data present in a block varies depending on the type of blockchain.
  • For a bitcoin transaction, the data would be the details of a transaction such as a sender, the recipient and the value of coins.
  • The hash is a unique string of encrypted data defined to identify the corresponding block and its data.
  • This hash is processed with new transactions to create a new hash for the next chain block.
  • Because each block is bound to its previous block by hashing, any change in any part of the data would change all the hashes, making all subsequent blocks invalid and false (see the sketch after this list).
  • Each time a new transaction is launched in a public blockchain, it is transmitted via an open peer-to-peer (P2P) network, accessible to all.
  • A group of people working on the peer-to-peer bitcoin network, also called miners, uses its computational power to record transactions and verify their accuracy by solving a complex cryptographic puzzle, also called "proof of work".
  • Whenever a miner solves the puzzle and mines the block, it must be validated by the remaining nodes in accordance with the consensus protocol.
  • If the block is validated, it will be added to the network's blockchain and the miner who solved the puzzle will be rewarded with cryptocurrency.
  • On the other hand, private blockchains operate on authorized networks in which only certain access rights are assigned to the selected entities and create new transactions on the channel.
  • All blocks in the blockchain are linked in chronological order and are monitored via P2P nodes, making record tampering or network corruption highly unlikely.
  • If someone wanted to hack the Bitcoin blockchain, they would have to alter not just one block but every block linked to it, and then redo the proof-of-work on all the computers connected via the peer-to-peer network in a very short time, which is practically impossible.
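The sketch referred to in the list above: a toy chain of hash-linked blocks showing how a single edit breaks every later link. The block fields are placeholders, not Bitcoin's actual block format.

```python
# Toy chain of hash-linked blocks: editing one block breaks every later link.
import hashlib
import json

def block_hash(block):
    """Hash the block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain, prev = [], "0" * 64                       # genesis has no predecessor
for tx in ["alice->bob:5", "bob->carol:2", "carol->dave:1"]:
    block = {"data": tx, "prev_hash": prev}
    prev = block_hash(block)
    chain.append(block)

def chain_is_valid(chain):
    return all(later["prev_hash"] == block_hash(earlier)
               for earlier, later in zip(chain, chain[1:]))

print(chain_is_valid(chain))                     # True

chain[0]["data"] = "alice->bob:500"              # tamper with the first block
print(chain_is_valid(chain))                     # False: every later link breaks
```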
Does this mean that blockchain technology is extremely unalterable?
Well no. We should rather consider scenarios in which the blockchain can be altered.
If a miner or a mining pool controls more than half of the hash power on a blockchain network, the power would be reconcentrated into a single entity, thus opening the door for attacks and illegal gains.
Conclusion
In conclusion, it can be said that absolute immutability does not exist.
But this decentralized and distributed digital blockchain ledger has enormous potential to meet the ever-changing needs of different sectors. And without a doubt, the cryptocurrency market will be safer and more reliable in the near future.

submitted by smartcryptosolution to u/smartcryptosolution

The Proof is in the Pudding: Proofs of Work for Solving Discrete Logarithms

Cryptology ePrint Archive: Report 2018/939
Date: 2018-10-05
Author(s): Marcella Hastings, Nadia Heninger, Eric Wustrow

Link to Paper


Abstract
We propose a proof of work protocol that computes the discrete logarithm of an element in a cyclic group. Individual provers generating proofs of work perform a distributed version of the Pollard rho algorithm. Such a protocol could capture the computational power expended to construct proof-of-work-based blockchains for a more useful purpose, as well as incentivize advances in hardware, software, or algorithms for an important cryptographic problem. We describe our proposed construction and elaborate on challenges and potential trade-offs that arise in designing a practical proof of work.
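For intuition, here is a single-machine sketch of Pollard's rho for discrete logarithms in a deliberately tiny group; all parameters are chosen only so the example runs instantly, and the paper's protocol distributes this same kind of pseudo-random walk across many provers.

```python
# Single-machine sketch of Pollard's rho for discrete logs in a toy group:
# the order-233 subgroup of Z_467^*. All parameters are illustrative; the
# paper's protocol distributes this same pseudo-random walk across provers.
from math import gcd

P_MOD = 467      # prime modulus, 467 = 2*233 + 1
Q = 233          # prime order of the subgroup generated by g
g = 4            # generator of that subgroup
secret = 127     # the discrete log we pretend not to know
h = pow(g, secret, P_MOD)

def step(x, a, b):
    """One walk step; the invariant x == g^a * h^b (mod P_MOD) is preserved."""
    if x % 3 == 0:
        return (x * x) % P_MOD, (2 * a) % Q, (2 * b) % Q
    if x % 3 == 1:
        return (x * g) % P_MOD, (a + 1) % Q, b
    return (x * h) % P_MOD, a, (b + 1) % Q

def pollard_rho():
    tortoise = hare = (1, 0, 0)
    while True:
        tortoise = step(*tortoise)
        hare = step(*step(*hare))                # hare moves twice as fast
        if tortoise[0] == hare[0]:
            break
    # g^a1 * h^b1 == g^a2 * h^b2  =>  k*(b1 - b2) == a2 - a1 (mod Q)
    (_, a1, b1), (_, a2, b2) = tortoise, hare
    db = (b1 - b2) % Q
    if gcd(db, Q) != 1:                          # unlucky collision; toy-sized fallback
        return next(k for k in range(Q) if pow(g, k, P_MOD) == h)
    return (a2 - a1) * pow(db, Q - 2, Q) % Q     # Q is prime, so this is the inverse

k = pollard_rho()
assert pow(g, k, P_MOD) == h
print("recovered discrete log:", k)
```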

References
  1. SpaceMint: A cryptocurrency based on proofs of space. In: FC’18. Springer (2018)
  2. Back, A.: Hashcash-a denial of service counter-measure (2002)
  3. Ball, M., Rosen, A., Sabin, M., Vasudevan, P.N.: Proofs of work from worst-case assumptions. In: CRYPTO 2018. Springer International Publishing (2018)
  4. Barbulescu, R., Gaudry, P., Joux, A., Thomé, E.: A heuristic quasi-polynomial algorithm for discrete logarithm in finite fields of small characteristic. In: EUROCRYPT’14 (2014)
  5. Barker, E., Chen, L., Roginsky, A., Vassilev, A., Davis, R.: SP 800-56A Revision 3. Recommendation for pair-wise key establishment schemes using discrete logarithm cryptography. National Institute of Standards & Technology (2018)
  6. Biryukov, A., Pustogarov, I.: Proof-of-work as anonymous micropayment: Rewarding a Tor relay. In: FC’15. Springer (2015)
  7. Bitansky, N., Canetti, R., Chiesa, A., Goldwasser, S., Lin, H., Rubinstein, A., Tromer, E.: The hunting of the SNARK. Journal of Cryptology 30(4) (2017)
  8. Bloom, B.H.: Space/time trade-offs in hash coding with allowable errors. Commun. ACM 13(7), 422–426 (Jul 1970). https://doi.org/10.1145/362686.362692
  9. Boneh, D., Bonneau, J., Bünz, B., Fisch, B.: Verifiable delay functions. In: Annual International Cryptology Conference. pp. 757–788. Springer (2018)
  10. Bos, J.W., Kaihara, M.E., Kleinjung, T., Lenstra, A.K., Montgomery, P.L.: Solving a 112-bit prime elliptic curve discrete logarithm problem on game consoles using sloppy reduction. International Journal of Applied Cryptography 2(3) (2012)
  11. Buterin, V.: Uncle rate and transaction fee analysis, https://blog.ethereum.org/2016/10/31/uncle-rate-transaction-fee-analysis/
  12. Certicom ECC challenge (1997), http://certicom.com/images/pdfs/challenge-2009.pdf, Updated 10 Nov 2009. Accessed via Web Archive
  13. Diffie, W., Hellman, M.: New directions in cryptography. IEEE transactions on Information Theory 22(6), 644–654 (1976)
  14. Dwork, C., Naor, M.: Pricing via processing or combatting junk mail. In: Annual International Cryptology Conference. pp. 139–147. Springer (1992)
  15. Ethereum Project: Ethereum white paper, https://github.com/ethereum/wiki/wiki/White-Paper#modified-ghost-implementation
  16. Gordon, D.M.: Discrete logarithms in GF(P) using the number field sieve. SIAM J. Discret. Math. 6(1), 124–138 (Feb 1993). https://doi.org/10.1137/0406010
  17. Jakobsson, M., Juels, A.: Proofs of work and bread pudding protocols. In: Secure Information Networks, pp. 258–272. Springer (1999)
  18. King, S.: Primecoin: Cryptocurrency with prime number proof-of-work (2013)
  19. Kleinjung, T., Diem, C., Lenstra, A.K., Priplata, C., Stahlke, C.: Computation of a 768-bit prime field discrete logarithm. In: EUROCRYPT’17. Springer (2017)
  20. Lepinski, M., Kent, S.: Additional Diffie-Hellman groups for use with IETF standards. RFC 5114, RFC Editor (2008), http://rfc-editor.org/rfc/rfc5114.txt
  21. Lochter, M.: Blockchain as cryptanalytic tool. Cryptology ePrint Archive, Report 2018/893 (2018), https://eprint.iacr.org/2018/893.pdf
  22. Miller, A., Juels, A., Shi, E., Parno, B., Katz, J.: Permacoin: Repurposing Bitcoin work for data preservation. In: 2014 IEEE S&P. pp. 475–490. IEEE (2014)
  23. Nakamoto, S.: Bitcoin: A peer-to-peer electronic cash system. White paper (2008)
  24. National Institute of Standards and Technology: FIPS PUB 186-4: Digital Signature Standard (DSS). National Institute of Standards and Technology (Jul 2013)
  25. Percival, C., Josefsson, S.: The scrypt password-based key derivation function. RFC 7914, RFC Editor (Aug 2016), http://rfc-editor.org/rfc/rfc7914.txt
  26. Pollard, J.M.: Monte carlo methods for index computation (mod p). In: Mathematics of Computation. vol. 32 (1978)
  27. Poon, J., Buterin, V.: Plasma: Scalable autonomous smart contracts (2017)
  28. Shanks, D.: Class number, a theory of factorization, and genera. In: Proc. of Symp. Math. Soc., 1971. vol. 20, pp. 41–440 (1971)
  29. Sompolinsky, Y., Zohar, A.: Secure high-rate transaction processing in Bitcoin. In: FC’15. pp. 507–527. Springer (2015)
  30. Teske, E.: Speeding up Pollard’s rho method for computing discrete logarithms. In: ANTS-III. pp. 541–554. Springer-Verlag, Berlin, Heidelberg (1998)
  31. Valenta, L., Adrian, D., Sanso, A., Cohney, S., Fried, J., Hastings, M., Halderman, J.A., Heninger, N.: Measuring small subgroup attacks against Diffie-Hellman. In: NDSS (2017)
  32. Valenta, L., Sullivan, N., Sanso, A., Heninger, N.: In search of CurveSwap: Measuring elliptic curve implementations in the wild. In: EuroS&P. IEEE (2018)
  33. Van Oorschot, P.C., Wiener, M.J.: Parallel collision search with cryptanalytic applications. Journal of cryptology 12(1), 1–28 (1999)
  34. de Vries, A.: Bitcoin’s growing energy problem. Joule 2(5), 801–805 (2018)
  35. Wenger, E., Wolfger, P.: Harder, better, faster, stronger: elliptic curve discrete logarithm computations on FPGAs. Journal of Cryptographic Engineering (2016)
  36. Wiener, M.J., Zuccherato, R.J.: Faster attacks on elliptic curve cryptosystems. In: International workshop on selected areas in cryptography. Springer (1998)
  37. Wustrow, E., VanderSloot, B.: DDoSCoin: Cryptocurrency with a malicious proofof-work. In: WOOT (2016)
submitted by dj-gutz to myrXiv

Did you know that LISK uses Schnorr signature-based Ed25519 scheme which is more secure, much faster, more scalable than secp256k1 which is used by Bitcoin, Ethereum, Stratis

Schnorr signatures have been praised by Bitcoin developers for a while
Adam Back admitted they were more secure:
https://bitcointalk.org/index.php?topic=511074.msg5727641#msg5727641
And they are much faster (scalable to verifying hundreds of thousands of transactions per second):
https://bitcointalk.org/index.php?topic=103172.0
DJB and friends claim that with their ed25519 curve (the "ed" is for Edwards) and careful implementation they can do batch verification of 70,000+ signatures per second on a cheap quad-core Intel Westmere chip, which is now several generations old. Given advances in CPUs over time, it seems likely that in the very near future the cited software will be capable of verifying many hundreds of thousands of signatures per second even if you keep the core count constant. But core counts are not constant - it seems likely that in 10 years or so 24-32 core chips will be standard even on consumer desktops. At that point a million signatures per second or more doesn't sound unreasonable.
Gavin Andresen, the former Bitcoin Chief Scientist, wanted to support it in Bitcoin: https://www.reddit.com/Bitcoin/comments/2jw5pm/im_gavin_andresen_chief_scientist_at_the_bitcoin/clfp3xj/
Bitcoin developers have discussed including it: https://github.com/bitcoin-core/secp256k1/pull/212
However, it is still on the wishlist: https://en.bitcoin.it/wiki/Softfork_wishlist
Ed25519 is used in Tahoe-LAFS, one of the most respected crypto projects: https://moderncrypto.org/mail-archive/curves/2014/000069.html
LISK is IoT friendly
A nice feature of Schnorr signatures is that by design they do not require a lot of computation on the signer side. Therefore, you can use them even on a computationally weak platform (think of a smart card or RFID), or on a platform with no hardware support for multiple-precision arithmetic.
Advantages of Schnorr signatures
According to David Harding, Schnorr signatures can bring many benefits
https://bitcoin.stackexchange.com/questions/34288/what-are-the-implications-of-schnorr-signatures
Scalable multisig transactions
The magic of Schnorr signatures is most evident in their ability to aggregate signatures from multiple inputs into a single signature that is validated for each individual transaction. The scaling implications of this are obvious: aggregation allows for non-trivial savings in terms of transmission, validation and storage for every peer on the network. The chart below illustrates the historical impact a switch to Schnorr signatures would have had in terms of space savings on the blockchain. (Alex B.)
Infamous malleability is non-issue in LISK
There is provably no inherent signature malleability, while ECDSA has a known malleability and lacks a proof that no other forms exist. Note that Segregated Witness already makes signature malleability not result in transaction malleability, however. https://www.elementsproject.org/elements/schnorr-signatures/
Bitcoin has malleability bugs
submitted by pcdinh to Lisk

Bitcoin: A Peer-to-Peer Electronic Cash System

Date: 2008-10-31
Author(s): Satoshi Nakamoto

Link to Paper


Abstract
A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution. Digital signatures provide part of the solution, but the main benefits are lost if a trusted third party is still required to prevent double-spending. We propose a solution to the double-spending problem using a peer-to-peer network. The network timestamps transactions by hashing them into an ongoing chain of hash-based proof-of-work, forming a record that cannot be changed without redoing the proof-of-work. The longest chain not only serves as proof of the sequence of events witnessed, but proof that it came from the largest pool of CPU power. As long as a majority of CPU power is controlled by nodes that are not cooperating to attack the network, they'll generate the longest chain and outpace attackers. The network itself requires minimal structure. Messages are broadcast on a best effort basis, and nodes can leave and rejoin the network at will, accepting the longest proof-of-work chain as proof of what happened while they were gone.
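As a companion to the abstract, a minimal sketch of the hash-based proof-of-work it describes: search for a nonce whose double-SHA256 digest of a header falls below a target. The header fields and difficulty are placeholders, not Bitcoin's actual serialization.

```python
# Minimal hash-based proof-of-work: find a nonce whose double-SHA256 digest
# of a placeholder header falls below a target. Header fields and difficulty
# are illustrative, not Bitcoin's actual serialization.
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, difficulty_bits: int = 16) -> int:
    """Return a nonce giving `difficulty_bits` leading zero bits (kept low here)."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(8, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

header = b"prev_hash|merkle_root|timestamp"
nonce = mine(header)
print("nonce:", nonce)
print("hash :", double_sha256(header + nonce.to_bytes(8, "little")).hex())
```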

References
[1] W. Dai, "b-money," http://www.weidai.com/bmoney.txt, 1998.
[2] H. Massias, X.S. Avila, and J.-J. Quisquater, "Design of a secure timestamping service with minimal trust requirements," In 20th Symposium on Information Theory in the Benelux, May 1999.
[3] S. Haber, W.S. Stornetta, "How to time-stamp a digital document," In Journal of Cryptology, vol 3, no 2, pages 99-111, 1991.
[4] D. Bayer, S. Haber, W.S. Stornetta, "Improving the efficiency and reliability of digital time-stamping," In Sequences II: Methods in Communication, Security and Computer Science, pages 329-334, 1993.
[5] S. Haber, W.S. Stornetta, "Secure names for bit-strings," In Proceedings of the 4th ACM Conference on Computer and Communications Security, pages 28-35, April 1997.
[6] A. Back, "Hashcash - a denial of service counter-measure," http://www.hashcash.org/papers/hashcash.pdf, 2002.
[7] R.C. Merkle, "Protocols for public key cryptosystems," In Proc. 1980 Symposium on Security and Privacy, IEEE Computer Society, pages 122-133, April 1980.
[8] W. Feller, "An introduction to probability theory and its applications," 1957.
submitted by dj-gutz to myrXiv

Minimizing Trust in Hardware Wallets with Two Factor Signatures

Cryptology ePrint Archive: Report 2019/006
Date: 2019-01-02
Author(s): Antonio Marcedone, Rafael Pass, abhi shelat

Link to Paper


Abstract
We introduce the notion of two-factor signatures (2FS), a generalization of a two-out-of-two threshold signature scheme in which one of the parties is a hardware token which can store a high-entropy secret, and the other party is a human who knows a low-entropy password. The security (unforgeability) property of 2FS requires that an external adversary corrupting either party (the token or the computer the human is using) cannot forge a signature. This primitive is useful in contexts like hardware cryptocurrency wallets in which a signature conveys the authorization of a transaction. By the above security property, a hardware wallet implementing a two-factor signature scheme is secure against attacks mounted by a malicious hardware vendor; in contrast, all currently used wallet systems break under such an attack (and as such are not secure under our definition). We construct efficient provably-secure 2FS schemes which produce either Schnorr signature (assuming the DLOG assumption), or EC-DSA signatures (assuming security of EC-DSA and the CDH assumption) in the Random Oracle Model, and evaluate the performance of implementations of them. Our EC-DSA based 2FS scheme can directly replace currently used hardware wallets for Bitcoin and other major cryptocurrencies to enable security against malicious hardware vendors.

References
[1] Jesús F Almansa, Ivan Damgård, and Jesper Buus Nielsen. Simplified threshold RSA with adaptive and proactive security. In Eurocrypt, volume 4004, pages 593–611. Springer, 2006.
[2] Dan Boneh, Xuhua Ding, Gene Tsudik, and Chi-Ming Wong. A method for fast revocation of public key certificates and security capabilities. In USENIX Security Symposium, pages 22–22, 2001.
[3] Jan Camenisch, Anja Lehmann, Gregory Neven, and Kai Samelin. Virtual smart cards: how to sign with a password and a server, 2016.
[4] Yvo Desmedt and Yair Frankel. Threshold cryptosystems. In Advances in Cryptology – CRYPTO 1989, pages 307–315. Springer, 1990.
[5] J. Doerner, Y. Kondi, E. Lee, and a. shelat. Secure two-party threshold ECDSA from ECDSA assumptions. In 2018 IEEE Symposium on Security and Privacy (SP), pages 595–612, 2018.
[6] Rosario Gennaro and Steven Goldfeder. Fast multiparty threshold ecdsa with fast trustless setup. In Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security, pages 1179–1194. ACM, 2018.
[7] Rosario Gennaro, Stanisław Jarecki, Hugo Krawczyk, and Tal Rabin. Robust and efficient sharing of RSA functions. In Advances in Cryptology – CRYPTO 1996, pages 157–172. Springer, 1996.
[8] Steven Goldfeder, Rosario Gennaro, Harry Kalodner, Joseph Bonneau, Joshua A Kroll, Edward W Felten, and Arvind Narayanan. Securing bitcoin wallets via a new DSA/ECDSA threshold signature scheme, 2015.
[9] Yehuda Lindell. Fast secure two-party ECDSA signing. In Advances in Cryptology – CRYPTO 2017, pages 613–644. Springer, 2017.
[10] Yehuda Lindell and Ariel Nof. Fast secure multiparty ecdsa with practical distributed key generation and applications to cryptocurrency custody. In Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security, pages 1837–1854. ACM, 2018.
[11] Philip MacKenzie and Michael K Reiter. Delegation of cryptographic servers for capture-resilient devices. Distributed Computing, 16(4):307–327, 2003.
[12] Philip MacKenzie and Michael K Reiter. Networked cryptographic devices resilient to capture. International Journal of Information Security, 2(1):1–20, 2003.
[13] Antonio Marcedone, Rafael Pass, and abhi shelat. Minimizing trust in hardware wallets with two factor signatures. Cryptology ePrint Archive, Report 2018/???, 2018.
[14] Microchip. Atecc608a datasheet, 2018.
[15] Antonio Nicolosi, Maxwell N Krohn, Yevgeniy Dodis, and David Mazieres. Proactive two-party signatures for user authentication. In NDSS, 2003.
[16] Marek Palatinus, Pavol Rusnak, Aaron Voisine, and Sean Bowe. Mnemonic code for generating deterministic keys (bip39). https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki.
[17] Tal Rabin. A simplified approach to threshold and proactive RSA. In Advances in Cryptology – CRYPTO 1998, pages 89–104. Springer, 1998.
[18] T.C. Sottek. NSA reportedly intercepting laptops purchased online to install spy malware, December 2013. [Online; posted 29-December-2013; https://www.theverge.com/2013/12/29/5253226/nsa-cia-fbi-laptop-usb-plant-spy].
submitted by dj-gutz to myrXiv [link] [comments]

Compact Multi-Signatures for Smaller Blockchains

Cryptology ePrint Archive: Report 2018/483
Date: 2018-06-10
Author(s): Dan Boneh, Manu Drijvers, Gregory Neven

Link to Paper


Abstract
We construct new multi-signature schemes that provide new functionality. Our schemes are designed to reduce the size of the Bitcoin blockchain, but are useful in many other settings where multi-signatures are needed. All our constructions support both signature compression and public-key aggregation. Hence, to verify that a number of parties signed a common message m, the verifier only needs a short multi-signature, a short aggregation of their public keys, and the message m. We give new constructions that are derived from Schnorr signatures and from BLS signatures. Our constructions are in the plain public key model, meaning that users do not need to prove knowledge or possession of their secret key.
In addition, we construct the first short accountable-subgroup multi-signature (ASM) scheme. An ASM scheme enables any subset S of a set of n parties to sign a message m so that a valid signature discloses which subset generated the signature (hence the subset S is accountable for signing m). We construct the first ASM scheme where signature size is only O(k) bits over the description of S, where k is the security parameter. Similarly, the aggregate public key is only O(k) bits, independent of n. The signing process is non-interactive. Our ASM scheme is very practical and well suited for compressing the data needed to spend funds from a t-of-n Multisig Bitcoin address, for any (polynomial size) t and n.
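To make the "short multi-signature plus short aggregated public key" idea concrete, here is a naive Schnorr-style toy in Python (reusing a small, insecure mod-p group purely for illustration): n signers produce one constant-size (R, s) pair that verifies against the product of their public keys. Note that this naive aggregation is exactly the kind of scheme that is vulnerable to rogue-key attacks, which the paper's plain-public-key-model constructions are designed to avoid; the paper's BLS variants additionally rely on pairings, which are not modeled here.

```python
import hashlib
import secrets

p, q, g = 2039, 1019, 4    # insecure toy group: p = 2q + 1, g of prime order q

def H(*parts) -> int:
    return int.from_bytes(hashlib.sha256("|".join(map(str, parts)).encode()).digest(), "big") % q

n = 3
secret_keys = [secrets.randbelow(q) for _ in range(n)]
public_keys = [pow(g, x, p) for x in secret_keys]

# Aggregated public key: a single short value, independent of n.
X_agg = 1
for X in public_keys:
    X_agg = (X_agg * X) % p

def multisign(msg: str):
    # Each signer contributes a nonce and a partial signature; the results are combined.
    nonces = [secrets.randbelow(q) for _ in range(n)]
    R = 1
    for r in nonces:
        R = (R * pow(g, r, p)) % p
    c = H(R, X_agg, msg)
    s = sum((r + c * x) % q for r, x in zip(nonces, secret_keys)) % q
    return R, s                                  # constant size, no matter how many signers

def verify(msg: str, R: int, s: int) -> bool:
    return pow(g, s, p) == (R * pow(X_agg, H(R, X_agg, msg), p)) % p

R, s = multisign("spend 3-of-3 multisig output")
print(verify("spend 3-of-3 multisig output", R, s))   # True
```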

References
  1. Ahn, J.H., Green, M., Hohenberger, S.: Synchronized aggregate signatures: new definitions, constructions and applications. In: Al-Shaer, E., Keromytis, A.D., Shmatikov, V. (eds.) ACM CCS 10: 17th Conference on Computer and Communications Security. pp. 473–484. ACM Press, Chicago, Illinois, USA (Oct 4–8, 2010)
  2. Andresen, G.: m-of-n standard transactions. Bitcoin improvement proposal (BIP) 0011 (2011)
  3. Bagherzandi, A., Cheon, J.H., Jarecki, S.: Multisignatures secure under the discrete logarithm assumption and a generalized forking lemma. In: Ning, P., Syverson, P.F., Jha, S. (eds.) ACM CCS 08: 15th Conference on Computer and Communications Security. pp. 449–458. ACM Press, Alexandria, Virginia, USA (Oct 27–31, 2008)
  4. Bagherzandi, A., Jarecki, S.: Multisignatures using proofs of secret key possession, as secure as the Diffie-Hellman problem. In: Ostrovsky, R., Prisco, R.D., Visconti, I. (eds.) SCN 08: 6th International Conference on Security in Communication Networks. Lecture Notes in Computer Science, vol. 5229, pp. 218–235. Springer, Heidelberg, Germany, Amalfi, Italy (Sep 10–12, 2008)
  5. Bansarkhani, R.E., Sturm, J.: An efficient lattice-based multisignature scheme with applications to bitcoins. In: Foresti, S., Persiano, G. (eds.) CANS 16: 15th International Conference on Cryptology and Network Security. Lecture Notes in Computer Science, vol. 10052, pp. 140–155. Springer, Heidelberg, Germany, Milan, Italy (Nov 14–16, 2016)
  6. Barreto, P.S.L.M., Lynn, B., Scott, M.: On the selection of pairing-friendly groups. In: Matsui, M., Zuccherato, R.J. (eds.) SAC 2003: 10th Annual International Workshop on Selected Areas in Cryptography. Lecture Notes in Computer Science, vol. 3006, pp. 17–25. Springer, Heidelberg, Germany, Ottawa, Ontario, Canada (Aug 14–15, 2004)
  7. Bellare, M., Namprempre, C., Neven, G.: Unrestricted aggregate signatures. In: Arge, L., Cachin, C., Jurdzinski, T., Tarlecki, A. (eds.) ICALP 2007: 34th International Colloquium on Automata, Languages and Programming. Lecture Notes in Computer Science, vol. 4596, pp. 411–422. Springer, Heidelberg, Germany, Wroclaw, Poland (Jul 9–13, 2007)
  8. Bellare, M., Namprempre, C., Pointcheval, D., Semanko, M.: The one-more-RSAinversion problems and the security of Chaum’s blind signature scheme. Journal of Cryptology 16(3), 185–215 (Jun 2003)
  9. Bellare, M., Neven, G.: Multi-signatures in the plain public-key model and a general forking lemma. In: Juels, A., Wright, R.N., Vimercati, S. (eds.) ACM CCS 06: 13th Conference on Computer and Communications Security. pp. 390–399. ACM Press, Alexandria, Virginia, USA (Oct 30 – Nov 3, 2006)
  10. Boldyreva, A.: Threshold signatures, multisignatures and blind signatures based on the gap-Diffie-Hellman-group signature scheme. In: Desmedt, Y. (ed.) PKC 2003: 6th International Workshop on Theory and Practice in Public Key Cryptography. Lecture Notes in Computer Science, vol. 2567, pp. 31–46. Springer, Heidelberg, Germany, Miami, FL, USA (Jan 6–8, 2003)
  11. Boldyreva, A., Gentry, C., O’Neill, A., Yum, D.H.: Ordered multisignatures and identity-based sequential aggregate signatures, with applications to secure routing. In: Ning, P., di Vimercati, S.D.C., Syverson, P.F. (eds.) ACM CCS 07: 14th Conference on Computer and Communications Security. pp. 276–285. ACM Press, Alexandria, Virginia, USA (Oct 28–31, 2007)
  12. Boneh, D., Gentry, C., Lynn, B., Shacham, H.: Aggregate and verifiably encrypted signatures from bilinear maps. In: Biham, E. (ed.) Advances in Cryptology – EUROCRYPT 2003. Lecture Notes in Computer Science, vol. 2656, pp. 416–432. Springer, Heidelberg, Germany, Warsaw, Poland (May 4–8, 2003)
  13. Boneh, D., Lynn, B., Shacham, H.: Short signatures from the Weil pairing. In: Boyd, C. (ed.) Advances in Cryptology – ASIACRYPT 2001. Lecture Notes in Computer Science, vol. 2248, pp. 514–532. Springer, Heidelberg, Germany, Gold Coast, Australia (Dec 9–13, 2001)
  14. Brogle, K., Goldberg, S., Reyzin, L.: Sequential aggregate signatures with lazy verification from trapdoor permutations - (extended abstract). In: Wang, X., Sako, K. (eds.) Advances in Cryptology – ASIACRYPT 2012. Lecture Notes in Computer Science, vol. 7658, pp. 644–662. Springer, Heidelberg, Germany, Beijing, China (Dec 2–6, 2012)
  15. Budroni, A., Pintore, F.: Efficient hash maps to G2 on BLS curves. Cryptology ePrint Archive, Report 2017/419 (2017), http://eprint.iacr.org/2017/419
  16. Burmester, M., Desmedt, Y., Doi, H., Mambo, M., Okamoto, E., Tada, M., Yoshifuji, Y.: A structured ElGamal-type multisignature scheme. In: Imai, H., Zheng, Y. (eds.) PKC 2000: 3rd International Workshop on Theory and Practice in Public Key Cryptography. Lecture Notes in Computer Science, vol. 1751, pp. 466–483. Springer, Heidelberg, Germany, Melbourne, Victoria, Australia (Jan 18–20, 2000)
  17. Castelluccia, C., Jarecki, S., Kim, J., Tsudik, G.: A robust multisignatures scheme with applications to acknowledgment aggregation. In: Blundo, C., Cimato, S. (eds.) SCN 04: 4th International Conference on Security in Communication Networks. Lecture Notes in Computer Science, vol. 3352, pp. 193–207. Springer, Heidelberg, Germany, Amalfi, Italy (Sep 8–10, 2005)
  18. Certicom Research: Sec 2: Recommended elliptic curve domain parameters. Tech. rep., Certicom Research (2010)
  19. Chang, C.C., Leu, J.J., Huang, P.C., Lee, W.B.: A scheme for obtaining a message from the digital multisignature. In: Imai, H., Zheng, Y. (eds.) PKC’98: 1st International Workshop on Theory and Practice in Public Key Cryptography. Lecture Notes in Computer Science, vol. 1431, pp. 154–163. Springer, Heidelberg, Germany, Pacifico Yokohama, Japan (Feb 5–6, 1998)
  20. Coron, J.S., Naccache, D.: Boneh et al.’s k-element aggregate extraction assumption is equivalent to the Diffie-Hellman assumption. In: Laih, C.S. (ed.) Advances in Cryptology – ASIACRYPT 2003. Lecture Notes in Computer Science, vol. 2894, pp. 392–397. Springer, Heidelberg, Germany, Taipei, Taiwan (Nov 30 – Dec 4, 2003)
  21. Drijvers, M., EdalatNejad, K., Ford, B., Neven, G.: Okamoto beats Schnorr: On the provable security of multi-signatures. Cryptology ePrint Archive, Report 2018/417 (2018), https://eprint.iacr.org/2018/417
  22. Fuentes-Castañeda, L., Knapp, E., Rodríguez-Henríquez, F.: Faster hashing to G2. In: Miri, A., Vaudenay, S. (eds.) SAC 2011: 18th Annual International Workshop on Selected Areas in Cryptography. Lecture Notes in Computer Science, vol. 7118, pp. 412–430. Springer, Heidelberg, Germany, Toronto, Ontario, Canada (Aug 11–12, 2012)
  23. Gentry, C., O’Neill, A., Reyzin, L.: A unified framework for trapdoor-permutationbased sequential aggregate signatures. In: Abdalla, M., Dahab, R. (eds.) PKC 2018: 21st International Conference on Theory and Practice of Public Key Cryptography, Part II. Lecture Notes in Computer Science, vol. 10770, pp. 34–57. Springer, Heidelberg, Germany, Rio de Janeiro, Brazil (Mar 25–29, 2018)
  24. Gentry, C., Ramzan, Z.: Identity-based aggregate signatures. In: Yung, M., Dodis, Y., Kiayias, A., Malkin, T. (eds.) PKC 2006: 9th International Conference on Theory and Practice of Public Key Cryptography. Lecture Notes in Computer Science, vol. 3958, pp. 257–273. Springer, Heidelberg, Germany, New York, NY, USA (Apr 24–26, 2006)
  25. Hardjono, T., Zheng, Y.: A practical digital multisignature scheme based on discrete logarithms. In: Seberry, J., Zheng, Y. (eds.) Advances in Cryptology – AUSCRYPT’92. Lecture Notes in Computer Science, vol. 718, pp. 122–132. Springer, Heidelberg, Germany, Gold Coast, Queensland, Australia (Dec 13–16, 1993)
  26. Harn, L.: Group-oriented (t, n) threshold digital signature scheme and digital multisignature. IEE Proceedings-Computers and Digital Techniques 141(5), 307–313 (1994)
  27. Horster, P., Michels, M., Petersen, H.: Meta-multisignature schemes based on the discrete logarithm problem. In: Information Securitythe Next Decade. pp. 128–142. Springer (1995)
  28. Itakura, K., Nakamura, K.: A public-key cryptosystem suitable for digital multisignatures. Tech. rep., NEC Research and Development (1983)
  29. Komano, Y., Ohta, K., Shimbo, A., Kawamura, S.: Formal security model of multisignatures. In: Katsikas, S.K., Lopez, J., Backes, M., Gritzalis, S., Preneel, B. (eds.) ISC 2006: 9th International Conference on Information Security. Lecture Notes in Computer Science, vol. 4176, pp. 146–160. Springer, Heidelberg, Germany, Samos Island, Greece (Aug 30 – Sep 2, 2006)
  30. Le, D.P., Bonnecaze, A., Gabillon, A.: Multisignatures as secure as the DiffieHellman problem in the plain public-key model. In: Shacham, H., Waters, B. (eds.) PAIRING 2009: 3rd International Conference on Pairing-based Cryptography. Lecture Notes in Computer Science, vol. 5671, pp. 35–51. Springer, Heidelberg, Germany, Palo Alto, CA, USA (Aug 12–14, 2009)
  31. Li, C.M., Hwang, T., Lee, N.Y.: Threshold-multisignature schemes where suspected forgery implies traceability of adversarial shareholders. In: Santis, A.D. (ed.) Advances in Cryptology – EUROCRYPT’94. Lecture Notes in Computer Science, vol. 950, pp. 194–204. Springer, Heidelberg, Germany, Perugia, Italy (May 9–12, 1995)
  32. Lu, S., Ostrovsky, R., Sahai, A., Shacham, H., Waters, B.: Sequential aggregate signatures and multisignatures without random oracles. In: Vaudenay, S. (ed.) Advances in Cryptology – EUROCRYPT 2006. Lecture Notes in Computer Science, vol. 4004, pp. 465–485. Springer, Heidelberg, Germany, St. Petersburg, Russia (May 28 – Jun 1, 2006)
  33. Lysyanskaya, A., Micali, S., Reyzin, L., Shacham, H.: Sequential aggregate signatures from trapdoor permutations. In: Cachin, C., Camenisch, J. (eds.) Advances in Cryptology – EUROCRYPT 2004. Lecture Notes in Computer Science, vol. 3027, pp. 74–90. Springer, Heidelberg, Germany, Interlaken, Switzerland (May 2–6, 2004)
  34. Ma, C., Weng, J., Li, Y., Deng, R.: Efficient discrete logarithm based multisignature scheme in the plain public key model. Designs, Codes and Cryptography 54(2), 121–133 (2010)
  35. Maxwell, G., Poelstra, A., Seurin, Y., Wuille, P.: Simple schnorr multi-signatures with applications to bitcoin. Cryptology ePrint Archive, Report 2018/068 (2018), https://eprint.iacr.org/2018/068/20180118:124757
  36. Maxwell, G., Poelstra, A., Seurin, Y., Wuille, P.: Simple schnorr multi-signatures with applications to bitcoin. Cryptology ePrint Archive, Report 2018/068 (2018), https://eprint.iacr.org/2018/068/20180520:191909
  37. Merkle, R.C.: A digital signature based on a conventional encryption function. In: Pomerance, C. (ed.) Advances in Cryptology – CRYPTO’87. Lecture Notes in Computer Science, vol. 293, pp. 369–378. Springer, Heidelberg, Germany, Santa Barbara, CA, USA (Aug 16–20, 1988)
  38. Micali, S., Ohta, K., Reyzin, L.: Accountable-subgroup multisignatures: Extended abstract. In: ACM CCS 01: 8th Conference on Computer and Communications Security. pp. 245–254. ACM Press, Philadelphia, PA, USA (Nov 5–8, 2001)
  39. Michels, M., Horster, P.: On the risk of disruption in several multiparty signature schemes. In: International Conference on the Theory and Application of Cryptology and Information Security. pp. 334–345. Springer (1996)
  40. Nakamoto, S.: Bitcoin: A peer-to-peer electronic cash system (2008), http://bitcoin.org/bitcoin.pdf
  41. Neven, G.: Efficient sequential aggregate signed data. In: Smart, N.P. (ed.) Advances in Cryptology – EUROCRYPT 2008. Lecture Notes in Computer Science, vol. 4965, pp. 52–69. Springer, Heidelberg, Germany, Istanbul, Turkey (Apr 13–17, 2008)
  42. Ohta, K., Okamoto, T.: A digital multisignature scheme based on the Fiat-Shamir scheme. In: Imai, H., Rivest, R.L., Matsumoto, T. (eds.) Advances in Cryptology – ASIACRYPT’91. Lecture Notes in Computer Science, vol. 739, pp. 139–148. Springer, Heidelberg, Germany, Fujiyoshida, Japan (Nov 11–14, 1993)
  43. Ohta, K., Okamoto, T.: Multi-signature schemes secure against active insider attacks. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences 82(1), 21–31 (1999)
  44. Okamoto, T.: Provably secure and practical identification schemes and corresponding signature schemes. In: Brickell, E.F. (ed.) Advances in Cryptology – CRYPTO’92. Lecture Notes in Computer Science, vol. 740, pp. 31–53. Springer, Heidelberg, Germany, Santa Barbara, CA, USA (Aug 16–20, 1993)
  45. Park, S., Park, S., Kim, K., Won, D.: Two efficient RSA multisignature schemes. In: Han, Y., Okamoto, T., Qing, S. (eds.) ICICS 97: 1st International Conference on Information and Communication Security. Lecture Notes in Computer Science, vol. 1334, pp. 217–222. Springer, Heidelberg, Germany, Beijing, China (Nov 11–14, 1997)
  46. Pointcheval, D., Stern, J.: Security arguments for digital signatures and blind signatures. Journal of Cryptology 13(3), 361–396 (2000)
  47. Ristenpart, T., Yilek, S.: The power of proofs-of-possession: Securing multiparty signatures against rogue-key attacks. In: Naor, M. (ed.) Advances in Cryptology – EUROCRYPT 2007. Lecture Notes in Computer Science, vol. 4515, pp. 228–245. Springer, Heidelberg, Germany, Barcelona, Spain (May 20–24, 2007)
  48. Schnorr, C.P.: Efficient signature generation by smart cards. Journal of Cryptology 4(3), 161–174 (1991)
  49. Scott, M., Benger, N., Charlemagne, M., Perez, L.J.D., Kachisa, E.J.: Fast hashing to g2 on pairing-friendly curves. In: Shacham, H., Waters, B. (eds.) PAIRING 2009: 3rd International Conference on Pairing-based Cryptography. Lecture Notes in Computer Science, vol. 5671, pp. 102–113. Springer, Heidelberg, Germany, Palo Alto, CA, USA (Aug 12–14, 2009)
submitted by dj-gutz to myrXiv [link] [comments]


Drivechain RfD -- Follow Up | Paul Sztorc | Jun 10 2017

Paul Sztorc on Jun 10 2017:
Hi everyone,
It has been 3 weeks -- responses so far have been really helpful. People
jumped right in, and identified unfinished or missing parts much faster
than I thought they would (ie, ~two days). Very impressive.
Currently, we are working on the sidechain side of blind merged mining.
As you know, most of the Bitcoin cryptosystem is about finding the
longest chain, and displaying information about this chain. CryptAxe is
editing the sidechain code to handle reorganizations in a new way (an
even bigger departure than Namecoin's, imho).
I believe that I have responded to all the on-list objections that were
raised. I will 1st summarize the on-list objections, and 2nd summarize
the off-list discussion (focusing on three key themes).
On-List Objection Summary
In general, they were:
audit', I pointed out that it is actually optional (and, therefore,
free), and that it doesn't affect miners relative to each other, and
that it can be done in an ultra-cheap semi-trusted way with high
reliability.
maximizing to BMM sidechains, because the equation (Tx Fees - Zero Cost)
is always positive.
subsidy" case and the "sidechain" case. He cites the asymmetry I point
out below (in #2). I replied, and I give an answer below.
responded again, in general he seemed to raise many of the points
addressed in #1 (below).
that if 51% can reorg, they can also filter out the reorg proof. We are
at their mercy in all cases (for better or worse).
for a fee market, I pointed out that this limit does not need to be
imposed on miners by nodes...miners would be willing-and-able to
self-impose such a limit, as it maximizes their revenues.
sidechain, I pointed out my strong disagreement ("Unrestrained smart
contract execution will be the death of most of the interesting
applications...[could] destabilize Bitcoin itself") and introduced my
prion metaphor.
'ratchet' concept. I explained it to ZmnSCPxj in my reply. We had not
coded it at the time, but there is code for it now [1]. Tier proposed a
ratchet design, but I think ours is better (Tier did not find ours at
all, because it is buried in obscure notes, because I didn't think
anyone would make it this far so quickly).
identified with our NOP earlier.
of the OP Bribe amount between nodes and miners. I'm afraid I mostly
ignored these for now, as we aren't there yet.
political reasons, and I responded that in such a case, miners are free
to simply avoid ACKing, or to acquiesce to political pressure. Neither
affects the mainchain.
opportunity to create a pretext to kick other miners off the network. I
replied that it would not, and I also brought up the fact that my
Bitcoin security model was indifferent to which people happened to be
mining at any given time. I continue to believe that "mining
centralization" does not have a useful definition.
my belief that they would be useful, and linked to my site
(drivechain.info/projects) which contains a number of sidechain
use-cases, and cited my personal anecdotal experiences.
that I felt that I had indeed done this minimization. My view is that
Peter felt erroneously that it was possible to involve miners less,
because he neglected [1] that a 51% miner group is already involved
maximally, as they can create most messages and filter any message, and
[2] that there are cases where we need miners to filter out harmful
interactions among multiple chains (just as they filter out harmful
interactions among multiple txns [ie, "double spends"]). Peter has not
yet responded to this rebuttal.
out that sidechains+BMM is client-side validation. I asked Peter for
CS-V code, so that we can compare the safety and other features.
over the emphasis on frequency/speed of withdrawals. Also Sergio
emphasizes a hybrid model, which does not interest me.
If I missed any objections, I hope someone will point them out.
Off-List / Three Points of Ongoing Confusion
Off list, I have repeated a similar conversation perhaps 6-10 times
over the past week. There is a cluster of remaining objections which
centers around three topics -- speed, theft, and antifragility. I will
reply here, and add the answers to my FAQ (drivechain.info/faq).
  1. Speed
This objection is voiced after I point out that side-to-main transfers
("withdrawals") will probably take a long time, for example 5 months
each ( these are customizable parameters, and open for debate -- but if
withdrawals are every x=3 months, and only x=1 withdrawal can make
forward progress [on the mainchain] at a time, and only x=1 prospective
withdrawal can be assembled [by the sidechain] at a time, then we can
expect total withdrawal time to average 4.5 months [(.5)*3+3] ). The
response is something like "won't such a system be too slow, and
therefore unusable?".
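As a quick check of the arithmetic above (a sketch only, taking the x=3-month parameters exactly as stated):

```python
# Expected side-to-main withdrawal delay under the parameters stated above.
bundle_period_months = 3.0     # a prospective withdrawal bundle is assembled every 3 months
ack_period_months = 3.0        # one bundle at a time then makes ~3 months of forward progress

wait_for_next_bundle = 0.5 * bundle_period_months   # arriving mid-cycle on average: 1.5 months
total = wait_for_next_bundle + ack_period_months    # 1.5 + 3 = 4.5 months
print(total)
```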
Imho, replies of this kind disregard the effect of atomic cross-chain
swaps, which are instant.
( In addition, while side-to-main transfers are slow, main-to-side
transfers are quite fast, x~=10 confirmations. I would go as far as to
say that, just as the Lightning Network is enabled by SegWit and CSV,
Drivechain is enabled by atomic swaps and Counterparty-like
embedded consensus. )
Thanks to atomic swaps, someone can act as an investment banker or
custodian, and purchase side:BTC at a (tiny, competitive) discount and
then transfer those side-to-main at a minimal inconvenience (comparable
to that of someone who buys a bank CD). Through market activities, the
entire system becomes exactly as patient as its most-patient members.
As icing on the cake, people who aren't planning on using their BTC
anytime soon (ie "the patient") can even get a tiny investment yield, in
return for providing this service.
  2. Security
This objection usually says something like "Aren't you worried that 51%
[hashrate] will steal the coins, given that mining is so centralized
these days?"
The logic of drivechain is to take a known fact -- that miners do not
steal from exchanges (by coordinating to doublespend deposits to those
exchanges) -- and generalize it to a new situation -- that [hopefully]
miners will not steal from sidechains (by coordinating to make 'invalid'
withdrawals from them).
My generalization is slightly problematic, because "a large mainchain
reorg today" would hit the entire Bitcoin system and de-confirm all of
the network's transactions, whereas a sidechain-theft would hit only a
small portion of the system. This asymmetry is a problem because of the
1:1 peg, which is explicitly symmetrical -- the thief makes off with coins
that are worth just as much as those coins that he did not attack.
The side:BTC are therefore relatively more vulnerable to theft, which
harms the generalization.
As I've just explained, to correct this relative deficiency, we add
extra inconvenience for any sidechain thievery, which is in this case
the long long withdrawal time of several months. (Which is also the main
distinction between DC and extension blocks).
I cannot realistically imagine an erroneous withdrawal persisting for
several weeks, let alone several months. First, over a timeframe of this
duration, there can be no pretense of 'we made an innocent mistake', nor
one that 'it is too inconvenient for us to fix this problem'. This
requires the attacker to be part of an explicitly malicious conspiracy.
Even in the conspiring case, I do not understand how miners would
coordinate the distribution of the eventual "theft" payout, ~3 months
from now -- if new hashrate comes online between now and then, does it
get a portion? -- if today's hashrate closes down, does it get a reduced
portion? -- who decides? I don't think that an algorithm can decide
(without creating a new mechanism, which -I believe- would have to be
powered by ongoing sustainable theft [which is impossible]), because the
withdrawal (ie the "theft") has to be initiated, with a known
destination, before it accumulates 3 months worth of acknowledgement.
Even if hashrate were controlled exclusively by one person, such a theft
would obliterate the sidechain-tx-fee revenue from all sidechains, for a
start. It would also greatly reduce the market price of [mainchain] BTC,
I feel, as it ends the sidechain experiment more-or-less permanently.
And even that di...[message truncated here by reddit bot]...
original: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-June/014559.html
submitted by dev_list_bot to bitcoin_devlist [link] [comments]

Please read our Frequently Asked Questions (FAQ)

What is CDY?
Bitcoin Candy (CDY) is a new chain forked from Bitcoin Cash at height 512666. The original BCH holders will be compensated with 1000 CDY for every BCH held. New features will be added to the forked chain, and we will explore anti-quantum-attack solutions on this chain.
What is this anti-quantum-attack thing?
In the past few years, D-Wave, IBM, Intel and other technology giants have invested a lot of manpower and resources in the research and development of quantum computers; for example, Google has embedded a D-Wave quantum computer into its cloud platform, and the research team under Prof. Pan Weijian at the Chinese University of Technology has made breakthrough achievements in quantum communications. Under such circumstances, the quantum age will no longer remain science-fiction conjecture (it may arrive in perhaps 5-10 years).
The development of quantum computers will not only bring great changes to people's lives, but also pose a serious threat to traditional public-key cryptography. Public-key cryptosystems such as ECDSA and RSA will be broken in polynomial time by these quantum computers, so virtual currencies like Bitcoin, which use ECDSA as their signature algorithm, will become unsafe. Finding a post-quantum digital signature algorithm is therefore of vital importance.
Our team has a deep background in post-quantum cryptography and will conduct research and experiments on the CDY chain into practical public-key signature algorithms for post-quantum-era blockchains.
Why fork from BCH, and can BTC holders get free CDY?
On August 1, 2017, the Bitcoin community finally ended its years-long expansion war by splitting the original bitcoin into two chains, Bitcoin Cash (BCH) and SegWit Bitcoin (inheriting the BTC ticker). We think BCH is more in line with Satoshi Nakamoto's vision of bitcoin as "a peer-to-peer electronic cash system" and will have a brighter future. Only those who hold BCH at height 512666 (about January 13) can get free CDY, at the rate of 1 BCH : 1000 CDY.
What is the total supply of CDY?
CDY will have a total supply of 21 billion, of which 1% will be pre-mined.
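A small sketch of the allocation arithmetic stated in this FAQ (the BCH balance below is just an example value):

```python
# CDY allocation figures as stated in this FAQ.
TOTAL_SUPPLY_CDY = 21_000_000_000              # 21 billion CDY
PREMINE_CDY = int(TOTAL_SUPPLY_CDY * 0.01)     # 1% pre-mined = 210,000,000 CDY
CDY_PER_BCH = 1000                             # 1 BCH : 1000 CDY at fork height 512666

bch_balance = 2.5                              # example balance held at the fork height
print(PREMINE_CDY)                             # 210000000
print(bch_balance * CDY_PER_BCH)               # 2500.0 CDY credited for this example balance
```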
How to claim my free CDY?
To get free CDY, you need to hold Bitcoin Cash before height 512666 (about January 13).
1. If your BCH is stored in a wallet where you control the private key yourself, you will definitely get free CDY; just wait for the wallet developer to provide a feature to claim it.
2. If your BCH is stored on an exchange, the exchange will receive the free CDY; please check whether or not the exchange will pass CDY on to its users. If the exchange where you store BCH does not support CDY, then to avoid losing assets it is recommended to withdraw your BCH to a wallet where you control the private key, or to an exchange that supports CDY.
submitted by momagic to BitcoinCandy [link] [comments]

What do banks think about bitcoin?

What do banks think about bitcoin?
https://preview.redd.it/z7h35tpjdwf11.jpg?width=695&format=pjpg&auto=webp&s=060576eca306b629672a41dafe2175054642fa8f
Representatives of the world's leading banks have repeatedly accused Bitcoin of being a financial pyramid whose real value is zero. Let's look at what caused such negative criticism and how the position of bankers has changed as the cryptocurrency has developed.
Why are bankers against bitcoin?
The modern financial system operates according to the following principle: states issue national currencies and establish the rules for their use, while banks store money and process payments, with their main task being to enforce the rules that the states have established.
So banks have a monopoly on handling money. In fact, they use our money for their own purposes and force us to pay for these services. And there is no alternative to this order of things - or rather, there wasn't until cryptocurrency appeared.
The creator of Bitcoin, Satoshi Nakamoto, challenged the existing financial system by offering a payment network that is able to work without banks and politicians. Of course, bankers did not like this, because they do not want to lose the right to dispose of other people's money.
Criticism of bitcoin
The media like writing about how politicians and representatives of the current financial system criticize bitcoin. Take, for example, the head of the US financial holding JPMorgan Chase, Jamie Dimon, who called bitcoin a fraud and promised to fire any of his employees engaged in cryptotrading, because such employees violate the bank's policy and are also "dumb" for getting involved with bitcoin.
The billionaire investor Howard Marks was also unflattering about bitcoin. In his words, bitcoin has no real value and is an "unreasonable whim".
To discredit Bitcoin, not only public statements were used, but also supposedly rigorous financial studies. Analysts at the banking holding Morgan Stanley, for instance, came to the conclusion that the real price of bitcoin was zero.
How has the attitude of banks toward crypto-currencies changed?
Gradually, critics of bitcoin are retracting their words, as Jamie Dimon and Howard Marks have already done. These rich and influential people apologized to the crypto community, but it is their real actions, rather than their public statements, that matter.
JPMorgan Chase has created a separate unit to develop a crypto-currency strategy and is working on its own blockchain platform. The details are kept secret, but the world's leading bank headed by Jamie Dimon has clearly decided not to stay away from the crypto market.
Another well-known investment bank, Goldman Sachs, has invested tens of millions of dollars in the Circle payment service, which offers trading in crypto-currencies and recently acquired the large crypto exchange Poloniex.
Asian banks also did not stay aloof. For example, Japan's leading financial company SBI Holding launched its own crypto-exchange VCTRADE.
Conclusions
Banks are not going to give bitcoin the green light, so attacks and accusations will continue. And the fact that several prominent representatives of the banking sector have publicly changed their minds still does not, by itself, mean anything.
At the same time, bankers see the benefits of being present in the crypto-currency market and are taking steps to find their own niche. This will attract new capital and customers, which will give the crypto market a significant impetus for growth.
In such a situation, it is extremely important that the crypto community maintains its independence and does not fall under the total control of banks. Bear in mind that the real fight is just beginning!
submitted by iTradeBit to u/iTradeBit [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and about what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
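For what it's worth, the arithmetic behind that exponential chart is just repeated doubling; here is a minimal sketch (the 1 MB starting point and the twenty-year horizon are the figures from that discussion, nothing more):

```python
# Doubling the blocksize every two years for twenty years, starting from 1 MB.
size_mb = 1
for year in range(0, 21, 2):
    print(f"year {year:2d}: {size_mb} MB")
    size_mb *= 2
# After 10 doublings: 2**10 = 1024, i.e. roughly a thousandfold increase.
```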
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
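Here is a toy sketch of that chunk-indexing idea (illustrative only: real BitTorrent also uses per-piece hashes from the .torrent metadata, tit-for-tat choking, and rarest-first piece selection, none of which is modeled here):

```python
import random

NUM_CHUNKS = 700   # e.g. a 700 MB film split into 1 MB chunks

class Peer:
    def __init__(self):
        self.have = set()   # indices of the chunks this peer already holds

    def download_from(self, other: "Peer"):
        # As soon as a peer holds chunk #99 it can serve chunk #99 to others,
        # so every downloader is immediately also an uploader for what it has.
        wanted = other.have - self.have
        if wanted:
            self.have.add(random.choice(sorted(wanted)))

seed = Peer()
seed.have = set(range(NUM_CHUNKS))          # one peer starts with the complete file
swarm = [Peer() for _ in range(10)]

# The more peers that hold pieces, the more sources everyone else has.
peers = swarm + [seed]
for _ in range(20000):
    a, b = random.sample(peers, 2)
    a.download_from(b)

print(sum(len(p.have) for p in swarm) / (10 * NUM_CHUNKS))   # fraction of the swarm filled
```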
The efficiency of the BitTorrent network seemed to jive with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Scaling
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kindof (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheeto's, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-souce programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis' worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have tiny little like utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh in with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of first stating what you want to do (the theorem, or the specification), and only then showing how to do it (the proof, or the implementation).
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
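Just to make that theorem/proof :: specification/implementation analogy concrete - and this is only my own toy illustration in Haskell, nothing to do with any actual Bitcoin code - under Curry-Howard the type is the theorem (the spec), and any well-behaved definition with that type is a proof (an implementation) of it:

    -- A toy Curry-Howard illustration (hypothetical example, not Bitcoin code).
    -- Read each type as a proposition/specification, and each definition as
    -- its proof/implementation.

    -- "A and B implies B and A" -- the type *is* the spec.
    swapSpec :: (a, b) -> (b, a)
    swapSpec (x, y) = (y, x)

    -- "A implies (B implies A)" -- again, the type fully pins down the intent
    -- (for any honest inhabitant that doesn't cheat with undefined/bottom).
    constSpec :: a -> b -> a
    constSpec x _ = x

    main :: IO ()
    main = print (swapSpec (1 :: Int, "hello"), constSpec 'x' (42 :: Int))

The point isn't these silly little functions - it's that the "what" lives in one place (the type) and the "how" lives in another (the definition), and a tool can mechanically check that they match.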
So I worry that we've inherited this habit, from the open-source GitHub C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical, military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 spacecraft, written in Java - and it took only a few minutes of formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission, when the damn spacecraft was already far out in deep space.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as sitting somewhere in the middle, such as Haskell and ML. I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional, algebraic-specification and program-verification communities involved as well.

The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes, but a C/C++ program has no easily identifiable semantics." So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
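Here's the flavor of what I mean, as a minimal sketch: a made-up toy ledger and a machine-checkable conservation property in Haskell/QuickCheck. Every name here - Ledger, transfer, prop_conservation - is hypothetical; this is not the Bitcoin protocol, just the shape of an "executable spec with verifiable properties":

    -- A minimal sketch of an "executable specification" with a machine-checkable
    -- property, using QuickCheck. The ledger model here is a made-up toy, not the
    -- actual Bitcoin protocol.
    import qualified Data.Map.Strict as M
    import Test.QuickCheck

    type Addr   = Int
    type Ledger = M.Map Addr Integer

    -- Toy transfer rule: move `amt` from `from` to `to` if funds suffice,
    -- otherwise leave the ledger unchanged.
    transfer :: Addr -> Addr -> Integer -> Ledger -> Ledger
    transfer from to amt l
      | amt > 0 && M.findWithDefault 0 from l >= amt =
          M.insertWith (+) to amt (M.adjust (subtract amt) from l)
      | otherwise = l

    -- The "specification": transfers never create or destroy coins.
    prop_conservation :: [(Addr, Integer)] -> Addr -> Addr -> Integer -> Property
    prop_conservation bal from to amt =
      let l  = M.fromListWith (+) [(a, abs b) | (a, b) <- bal]
          l' = transfer from to amt l
      in sum (M.elems l') === sum (M.elems l)

    main :: IO ()
    main = quickCheck prop_conservation

Obviously the real thing would need to model miners, fees, adversaries, network delays and so on - the point is only that the property is stated once, declaratively, and then checked by a machine rather than argued about on forums.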
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
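Roughly speaking - and this is just my hand-wavy sketch in Haskell with made-up types, since I can't show real Maude here - that functional-module / system-module split looks something like "deterministic evaluation as a pure function" versus "nondeterministic rules as a relation enumerating every possible next state":

    -- Hand-wavy Haskell sketch of the Maude-style split between deterministic
    -- "equations" (a pure evaluation function) and nondeterministic "rules"
    -- (a step relation returning every possible successor state).
    -- The toy expression language here is made up purely for illustration.

    data Expr = Lit Int | Add Expr Expr deriving Show

    -- Deterministic part: evaluation is a function -- one answer, always.
    eval :: Expr -> Int
    eval (Lit n)   = n
    eval (Add a b) = eval a + eval b

    -- Nondeterministic part: a "rule" may fire in several places, so stepping
    -- yields a *list* of possible next states (possibly empty when stuck).
    step :: Expr -> [Expr]
    step (Add (Lit a) (Lit b)) = [Lit (a + b)]
    step (Add a b) = [Add a' b | a' <- step a] ++ [Add a b' | b' <- step b]
    step (Lit _)   = []

    main :: IO ()
    main = do
      let e = Add (Add (Lit 1) (Lit 2)) (Lit 3)
      print (eval e)        -- deterministic result
      mapM_ print (step e)  -- all possible single rewrite steps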
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
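Just to make that naive question concrete, here's a crude, purely hypothetical sketch of a "mempool" that keeps only a small txid index in RAM and spills transaction bodies to disk. This is emphatically not how Bitcoin Core works, and it ignores fees, eviction, concurrency and crash-safety - it's only meant to show what "stored temporarily on the hard drive" could even mean:

    -- Purely hypothetical sketch of a disk-backed "mempool": only the txid index
    -- lives in RAM, transaction bodies are written to one file per txid.
    -- This is NOT how Bitcoin Core works; it just makes the question concrete.
    import qualified Data.Set as S
    import System.Directory (createDirectoryIfMissing, removeFile)
    import System.FilePath ((</>))

    type TxId = String

    data DiskMempool = DiskMempool
      { poolDir :: FilePath   -- where serialized transactions live
      , poolIds :: S.Set TxId -- small in-RAM index of known txids
      }

    newMempool :: FilePath -> IO DiskMempool
    newMempool dir = do
      createDirectoryIfMissing True dir
      return (DiskMempool dir S.empty)

    addTx :: DiskMempool -> TxId -> String -> IO DiskMempool
    addTx mp txid rawTx = do
      writeFile (poolDir mp </> txid) rawTx          -- body goes to disk
      return mp { poolIds = S.insert txid (poolIds mp) }

    getTx :: DiskMempool -> TxId -> IO (Maybe String)
    getTx mp txid
      | txid `S.member` poolIds mp = Just <$> readFile (poolDir mp </> txid)
      | otherwise                  = return Nothing

    removeTx :: DiskMempool -> TxId -> IO DiskMempool
    removeTx mp txid
      | txid `S.member` poolIds mp = do
          removeFile (poolDir mp </> txid)
          return mp { poolIds = S.delete txid (poolIds mp) }
      | otherwise = return mp

    main :: IO ()
    main = do
      mp  <- newMempool "toy-mempool"
      mp' <- addTx mp "deadbeef" "raw transaction bytes would go here"
      getTx mp' "deadbeef" >>= print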
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptic-curve stuff and the Byzantine Generals stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps of bandwidth now and 10,000 Mbps 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like kind of a tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while achieving high enough transactional throughput, all while staying within the RAM, bandwidth and hard-drive limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out by now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine Generals consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
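To illustrate what "embarrassingly parallel" means here - with a toy stand-in mixer instead of SHA-256, and made-up chunk sizes; this is an illustration of the decompose/recompose pattern, not real mining code:

    -- Toy illustration of why the hash race is "embarrassingly parallel":
    -- split the nonce space into chunks, search each chunk independently,
    -- and combine the results. The "hash" below is a stand-in mixer, NOT SHA-256.
    -- (Compile with -threaded and run with +RTS -N to actually use multiple cores.)
    import Control.Parallel.Strategies (parMap, rdeepseq)
    import Data.Bits (xor, shiftR)
    import Data.Word (Word64)
    import Data.List (find)
    import Data.Maybe (catMaybes, listToMaybe)

    -- Stand-in for a cryptographic hash (just an integer mixer for illustration).
    toyHash :: Word64 -> Word64
    toyHash x0 =
      let x1 = (x0 `xor` (x0 `shiftR` 33)) * 0xff51afd7ed558ccd
          x2 = (x1 `xor` (x1 `shiftR` 33)) * 0xc4ceb9fe1a85ec53
      in  x2 `xor` (x2 `shiftR` 33)

    -- A nonce "wins" if its hash falls below the target.
    meetsTarget :: Word64 -> Word64 -> Bool
    meetsTarget target nonce = toyHash nonce < target

    -- Each chunk is searched independently -- no coordination needed.
    searchChunk :: Word64 -> (Word64, Word64) -> Maybe Word64
    searchChunk target (lo, hi) = find (meetsTarget target) [lo .. hi]

    main :: IO ()
    main = do
      let target  = 0x0001000000000000          -- artificially easy difficulty
          chunks  = [ (i * 1000000, (i + 1) * 1000000 - 1) | i <- [0 .. 7] ]
          results = parMap rdeepseq (searchChunk target) chunks
      print (listToMaybe (catMaybes results))   -- first winning nonce found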
I guess what I'm really saying (and I don't mean to be rude here) is that C and Python and Java programmers might not be the best-qualified people to develop and formally prove the correctness of (note I do not say "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us to simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
https://personal.cis.strath.ac.uk/neil.ghani/papers/ghani-calco07
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
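For what it's worth, the rose tree the paper builds on really is this small in Haskell (the containers library's Data.Tree defines essentially the same type); the fold and the "blockchain as a one-branch tree" example below are my own toy additions, not anything from the paper:

    -- The rose tree the paper generalizes from, plus a toy fold over it.
    -- (Data.Tree in `containers` defines essentially the same type; the
    -- fold and examples below are my own illustrative additions.)
    data Rose a = Node a [Rose a] deriving Show

    -- Collapse a rose tree bottom-up: combine each node's label with the
    -- summaries of its children. A Merkle-style hash tree is this fold with
    -- "combine" instantiated to a hash function.
    foldRose :: (a -> [b] -> b) -> Rose a -> b
    foldRose combine (Node x kids) = combine x (map (foldRose combine) kids)

    -- Example: count the nodes, and flatten to a list (the "blockchain as a
    -- degenerate one-branch tree" picture).
    size :: Rose a -> Int
    size = foldRose (\_ ks -> 1 + sum ks)

    flatten :: Rose a -> [a]
    flatten = foldRose (\x ks -> x : concat ks)

    chainLike :: Rose Int
    chainLike = Node 1 [Node 2 [Node 3 []]]   -- a list is just a tree with one branch

    main :: IO ()
    main = do
      print (size chainLike)
      print (flatten chainLike)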
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude, including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java, which say how Bitcoin does it. This would help us communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many of the political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin

A cryptosystem is an implementation of cryptographic techniques and their accompanying infrastructure to provide information security services; it is also referred to as a cipher system. Put another way, a cryptosystem is a pair of algorithms: one for encrypting data and one for decrypting it. These algorithms often rely on a key which must be kept secret, in which case the process for generating and sharing that key is also considered part of the cryptosystem. Modern cryptography is essential to the digital world we live in and has grown to be quite complex.

Related reading: "Comparisons of Bitcoin Cryptosystem with Other Common Internet Transaction Systems by AHP Technique", Journal of Information and Organizational Sciences 41(1):69-87.

QC attacks: the most dangerous quantum-computer attack is against public-key cryptography. On traditional computers it takes on the order of 2^128 basic operations to recover the Bitcoin private key associated with a given Bitcoin public key - a number so massively large that any attack using traditional computers is completely impractical. However, it is known for sure that it would take a ...
