How did Satoshi Think of Bitcoin?
The following is an essay originally published on Unchained.com by Dhruv Bansal, CSO and Co-founder of Unchained, the Official US Collaborative Custody Partner of Bitcoin Magazine. For more information on services offered, custody products, and the relationship between Unchained and Bitcoin Magazine, please visit our website.
Bitcoin is often compared to the internet in the 1990s, but I believe the better analogy is to the telegraph in the 1840s.[1]
The telegraph was the first technology to transmit encoded data at near-light speed over long distances. It marked the birth of the telecommunications industry. The internet, though it is bigger in scale, richer in content, and many-to-many instead of one-to-one, is fundamentally still a telecommunications technology.
Both the telegraph and the internet rely upon business models in which companies deploy capital to build a physical network and then charge users to send messages through this network. AT&T’s network has historically transmitted telegrams, telephone calls, TCP/IP packets, text messages, and now TikToks.
The transformation of society through telecom has led to greater freedoms but also greater centralization. The internet has increased the reach of millions of content creators and small businesses, but has also strengthened the grasp of companies, governments and other institutions well-positioned enough to monitor and manipulate online activity.
But bitcoin is not the end of any transformation— it’s the beginning of one. Like telecommunications, bitcoin will change both human society and daily life. Predicting the full scope of this change today is akin to imagining the internet while living in the era of the telegraph.
This series attempts to imagine this future by starting with the past. This initial article traces the history of digital currencies before bitcoin. Only by understanding where prior projects fell short can we perceive what makes bitcoin succeed—and how it suggests a methodology for building the decentralized systems of the future.
Outline
I. Decentralized systems are markets
II. Decentralized markets require decentralized goods
III. How can decentralized systems price computations?
IV. Satoshi’s monetary policy goals led to bitcoin
V. Conclusion
A central claim of this article is that bitcoin can be thought of as an adaptation of Dai’s b-money project that eliminates the freedom to create money. Just weeks after this article was originally published, new emails surfaced in which Satoshi claimed to be unfamiliar with b-money, yet admitted that bitcoin starts “from exactly that point.” In light of this new evidence, we believe this central claim, while not historically accurate, is still a meaningful and helpful way to think about the origin of bitcoin.
How did Satoshi think of bitcoin?
Satoshi was brilliant, but bitcoin didn’t come out of nowhere.
Bitcoin iterated on existing work in cryptography, distributed systems, economics, and political philosophy. The concept of proof-of-work existed long before its use in money and prior cypherpunks such as Nick Szabo, Wei Dai, & Hal Finney anticipated and influenced the design of bitcoin with projects such as bit gold, b-money, and RPOW. Consider that, by 2008, when Satoshi wrote the bitcoin white paper,[2] many of the ideas important to bitcoin had already been proposed and/or implemented:
- Digital currencies should be P2P networks
- Proof-of-work is the basis of money creation
- Money is created through an auction
- Public key cryptography is used to define ownership & transfer of coins
- Transactions are batched into blocks
- Blocks are chained together through proof-of-work
- All blocks are stored by all participants
Bitcoin leverages all these concepts, but Satoshi didn’t originate any of them. To better understand Satoshi’s contribution, we should determine which principles of bitcoin are missing from the list.
Some obvious candidates are the finite supply of bitcoin, Nakamoto consensus, and the difficulty adjustment algorithm. But what led Satoshi to these ideas in the first place?
This article explores the history of digital currencies and makes the case that Satoshi’s focus on sound monetary policy is what led bitcoin to surmount challenges that defeated prior projects such as bit gold and b-money.
I. Decentralized systems are markets
Bitcoin is often described as a decentralized or distributed system. Unfortunately, the words “decentralized” and “distributed” are frequently confused. When applied to digital systems, both terms refer to ways a monolithic application can be decomposed into a network of communicating pieces.
For our purposes, the major difference between decentralized and distributed systems is not the topology of their network diagrams, but the way they enforce rules. We take some time in the following section to compare distributed and decentralized systems and motivate the idea that robust decentralized systems are markets.
Distributed systems rely upon central authorities
In this work, we take “distributed” to mean any system that has been broken up into many parts (often referred to as “nodes”) which must communicate, typically over a network.
Software engineers have grown adept at building globally distributed systems. The internet is composed of distributed systems collectively containing billions of nodes. We each have a node in our pocket that both participates in and relies upon these systems.
But almost all the distributed systems we use today are governed by some central authority, typically a system administrator, company, or government that is mutually trusted by all nodes in the system.
Central authorities ensure all nodes adhere to the system’s rules and remove, repair, or punish nodes that fail to do so. They are trusted to provide coordination, resolve conflicts, and allocate shared resources. Over time, central authorities manage changes to the system, upgrading it or adding features, and ensuring that participating nodes comply with the changes.
The benefits a distributed system gains from relying upon a central authority come with costs. While the system is robust against failures of its nodes, a failure of its central authority may cause it to stop functioning overall. The ability for the central authority to unilaterally make decisions means that subverting or eliminating the central authority is sufficient to control or destroy the entire system.
Despite these trade-offs, if there is a requirement that a single party or coalition must retain central authority, or if the participants within the system are content with relying upon a central authority, then a traditional distributed system is the best solution. No blockchain, token, or similar decentralized dressing is required.
In particular, the case of a VC- or government-backed cryptocurrency, with requirements that a single party can monitor or restrict payments and freeze accounts, is the perfect use case for a traditional distributed system.
Decentralized systems have no central authorities
We take “decentralized” to have a stronger meaning than “distributed”: decentralized systems are a subset of distributed systems that lack any central authority. A close synonym for “decentralized” is “peer-to-peer” (P2P).
Removing central authority confers several advantages. Decentralized systems:
- Grow quickly because they lack barriers to entry—anyone can grow the system by simply running a new node, and there is no requirement for registration or approval from a central authority.
- Are robust because there is no central authority whose failure can compromise the functioning of the system. All nodes are the same, so failures are local and the network routes around damage.
- Are difficult to capture, regulate, tax, or surveil because they lack centralized points of control for governments to subvert.
These strengths are why Satoshi chose a decentralized, peer-to-peer design for bitcoin:
“Governments are good at cutting off the heads of… centrally controlled networks like Napster, but pure P2P networks like Gnutella and Tor seem to be holding their own.” – Nakamoto, 2008
But these strengths come with corresponding weaknesses. Decentralized systems can be less efficient as each node must additionally bear responsibilities for coordination previously assumed by the central authority.
Decentralized systems are also plagued by scammy, adversarial behavior. Despite Satoshi’s nod to Gnutella, anyone who’s used a P2P file sharing program to download a file that turned out to be something gross or malicious understands the reasons that P2P file sharing never became the mainstream model for data transfer online.
Satoshi didn’t name it explicitly, but email is another decentralized system that has evaded government controls. And email is similarly notorious for spam.
Decentralized systems are governed through incentives
The root problem, in all of these cases, is that adversarial behavior (seeding bad files, sending spam emails) is not punished, and cooperative behavior (seeding good files, only sending useful emails) is not rewarded. Decentralized systems that rely upon their participants to be good actors fail to scale because they cannot prevent bad actors from also participating.
Without imposing a central authority, the only way to solve this problem is to use economic incentives. Good actors, by definition, play by the rules because they’re inherently motivated to do so. Bad actors are, by definition, selfish and adversarial, but proper economic incentives can redirect their bad behavior towards the common good. Decentralized systems that scale do so by ensuring that cooperative behavior is profitable and adversarial behavior is costly.
The best way to implement robust decentralized services is to create markets where all actors, both good and bad, are paid to provide that service. The lack of barriers to entry for buyers and sellers in a decentralized market encourages scale and efficiency. If the market’s protocols can protect participants from fraud, theft, and abuse, then bad actors will find it more profitable to either play by the rules or go attack a different system.
II. Decentralized markets require decentralized goods
But markets are complex. They must provide buyers and sellers the ability to post bids & asks as well as discover, match and settle orders. They must be fair, provide strong consistency, and maintain availability despite periods of volatility.
Global markets today are extremely capable and sophisticated, but using traditional goods and payment networks to implement incentives in a decentralized market is a nonstarter. Any coupling between a decentralized system and fiat money, traditional assets, or physical commodities would reintroduce dependencies on the central authorities that control payment processors, banks, & exchanges.
Decentralized systems cannot transfer cash, look up the balance of a brokerage account, or determine the ownership of property. They cannot execute payments denominated in any traditional good. The entire traditional economy is completely illegible from within decentralized systems. The inverse is not true—traditional systems can interact with bitcoin as easily as any other actor (once they decide they want to). The boundary between traditional and decentralized systems is not an impassable wall, but a semi-permeable membrane.
Creating decentralized markets requires trading new kinds of decentralized goods which are legible and transferable within decentralized systems.
Computation is the first decentralized good
The first example of a “decentralized good” is a special class of computations first proposed in 1993 by Cynthia Dwork and Moni Naor.[3]
Because of deep connections between mathematics, physics, and computer science, these computations cost real-world energy and hardware resources—they cannot be faked. Since real-world resources are scarce, these computations are also scarce.
The input for these computations can be any kind of data. The resulting output is a digital “proof” that the computations were performed on the given input data. Proofs contain a given “difficulty” which is (statistical) evidence of a given amount of computational work. Most importantly, the relationship between the input data, the proof, and the original computational work performed can be independently verified without appeal to any central authority.
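To make these properties concrete, here is a minimal hash-based sketch in Python. This is an illustrative construction of our own, not Dwork & Naor’s original scheme, which used different puzzle functions; the names `prove` and `verify` are ours.

```python
import hashlib
import itertools

def prove(data: bytes, difficulty: int) -> int:
    """Search for a nonce such that SHA-256(data || nonce) has
    `difficulty` leading zero bits. Expected cost: ~2**difficulty hashes."""
    for nonce in itertools.count():
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty) == 0:
            return nonce

def verify(data: bytes, nonce: int, difficulty: int) -> bool:
    """Checking a proof costs a single hash and requires no central authority."""
    digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty) == 0

nonce = prove(b"some input data", 12)   # ~4,096 hashes of work, on average
assert verify(b"some input data", nonce, 12)
```

Producing the proof is statistically expensive; verifying it is nearly free. That asymmetry is what makes the proof legible to every participant in a decentralized system.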
The idea of passing around some input data along with a digital proof as evidence of real-world computational work performed on that input is now called “proof-of-work”.[4] Proofs-of-work are, to use Nick Szabo’s phrase, “unforgeable costliness”. Because proofs-of-work are verifiable by anyone, they are economic resources that are legible to all participants in a decentralized system. Proofs-of-work turn computations on data into decentralized goods.

Dwork & Naor proposed using computations to limit the abuse of a shared resource by forcing participants to provide proofs-of-work with a certain minimum difficulty before they can access the resource:
“In this paper we suggest a computational approach to combatting the proliferation of electronic mail. More generally, we have designed an access control mechanism that can be used whenever it is desirable to restrain, but not prohibit, access to a resource.” – Dwork & Naor, 1993
In Dwork & Naor’s proposal, an email system administrator would set a minimum proof-of-work difficulty for delivering email. Users wanting to send email would need to perform a corresponding number of computations with that email as the input data. The resulting proof would be submitted to the server alongside any request to deliver the email.
Dwork & Naor referred to the difficulty of a proof-of-work as a “pricing function” because, by adjusting the difficulty, a “pricing authority” could ensure that the shared resource remained cheap to use for honest, average users but expensive for users seeking to exploit it. In the email delivery market, server administrators are the pricing authorities; they must choose a “price” for email delivery which is low enough for normal usage but too high for spam.
Though Dwork & Naor framed proofs-of-work as an economic disincentive to combat resource abuse, the nomenclature “pricing function” and “pricing authority” supports a different, market-based interpretation: users are purchasing access to a resource in exchange for computations at a price set by the resource’s controller.
In this interpretation, an email delivery network is really a decentralized market trading email delivery for computations. The minimum difficulty of a proof-of-work is the asking price for email delivery denominated in the currency of computations.
Currency is the second decentralized good
But computations aren’t a good currency.
The proofs used to “trade” computations are only valid for the input used in those computations. This unbreakable link between a specific proof and a specific input means that the proof-of-work for one input can’t be reused for a different input.
Proof-of-work was originally proposed as an access control mechanism for limiting spam emails. Users would be expected to provide proofs-of-work alongside any emails they wanted to send. This mechanism can also be thought of as a market where users are purchasing email deliveries with computations at a price chosen by the email service provider.
This constraint is useful – it can be used to prevent the work done by one buyer in the market from being re-spent by another. For example, HashCash, the first real implementation of the market for email delivery, included metadata such as the current timestamp and the sender’s email address in the input data to its proof-of-work computations. Proofs produced by a given user for a given email can’t be respent for sending a different email.
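The binding of a proof to its metadata can be sketched in a few lines of Python. The stamp format below is a simplification of ours for illustration, not the real HashCash v1 header format:

```python
import hashlib
import itertools
import time

def mint_stamp(recipient: str, difficulty: int = 12) -> str:
    """Mint a simplified hashcash-style stamp bound to one recipient
    and one point in time by including both in the hashed input."""
    header = f"{int(time.time())}:{recipient}"
    for counter in itertools.count():
        stamp = f"{header}:{counter}"
        digest = hashlib.sha256(stamp.encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty) == 0:
            return stamp

def stamp_is_valid(stamp: str, recipient: str, difficulty: int = 12) -> bool:
    """A server accepts only stamps minted for *its* address, so work
    done for one recipient cannot be re-spent on another."""
    if stamp.split(":")[1] != recipient:
        return False
    digest = hashlib.sha256(stamp.encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty) == 0

s = mint_stamp("alice@example.com")
assert stamp_is_valid(s, "alice@example.com")
assert not stamp_is_valid(s, "bob@example.com")   # proof can't be re-spent
```

The same property that prevents double-spending of work is exactly what prevents these proofs from circulating as money.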
But this also means that proof-of-work computations are bespoke goods. They aren’t fungible, they can’t be re-spent,[5] and they don’t solve the coincidence-of-wants problem. These missing monetary properties prevent computations from being currency. Despite the name, there is no incentive for an email delivery provider to want to accumulate HashCash, as there would be for actual cash.
Adam Back, inventor of HashCash, understood these problems:
“hashcash is not directly transferable because to make it distributed, each service provider accepts payment only in cash created for them. You could perhaps setup a digicash style mint (with chaumian ecash) and have the bank only mint cash on receipt of hash collisions addressed to it. However this means you’ve got to trust the bank not to mint unlimited amounts of money for it’s own use.” – Adam Back, 1997
We don’t want to exchange bespoke computations for every individual good or service sold in a decentralized economy. We want a general purpose digital currency that can directly be used to coordinate exchanges of value in any market.
Building a functioning digital currency while remaining decentralized is a significant challenge. A currency requires fungible units of equal value that can be transferred among users. This requires issuance models, cryptographic definitions of ownership and transfer, a discovery and settlement process for transactions, and a historical ledger. None of this infrastructure is required when proof-of-work is thought of as a mere “access control mechanism”.
Moreover, decentralized systems are markets, so all these basic functions of a currency must somehow be provided through paying service providers…in the units of the currency that’s being created!
Like compiling the first compiler, a black start of the electrical grid, or the evolution of life itself, the creators of digital currencies were confronted with a bootstrapping problem: how to define the economic incentives that underlie a functioning currency without having a functioning currency in which to denominate or pay those incentives.
Computations and currency are the first and second goods in decentralized markets. Proof-of-work alone allows for the exchange of computations but a functioning currency requires more infrastructure. It took 15 years for the cypherpunk community to develop that infrastructure.
The first decentralized market must trade computations for currency
Progress on this bootstrapping problem comes from properly framing its constraints.
Decentralized systems must be markets. Markets consist of buyers and sellers exchanging goods. The decentralized market for a digital currency only has two goods that are legible within it:
- Computations, through proof-of-work
- Units of the currency we’re trying to build
The only market trade possible must therefore be between these two goods. Computations must be sold for units of currency or, equivalently, units of currency must be sold for computations. Stating this is easy—the hard part is structuring this market so that simply exchanging currency for computation bootstraps all the capabilities of the currency itself!
The entire history of digital currencies, culminating in Satoshi’s 2008 white paper, was a series of increasingly sophisticated attempts at structuring this market. The following section reviews projects such as Nick Szabo’s bit gold and Wei Dai’s b-money. Understanding how these projects structured their markets, and why they failed, will help us frame why Satoshi and bitcoin succeeded.
III. How can decentralized systems price computations?
A major function of markets is price discovery. A market trading computations for currency must therefore discover the price of computation itself, as denominated in units of that currency.
We don’t typically assign monetary value to computations. We typically value the capacity to perform computations because we value the output of computations, not the computations themselves. If the same output can be produced more efficiently, with fewer computations, that is usually called “progress”.
Proofs-of-work represent specific computations whose only output is proof that they were performed. Producing the same proof by performing fewer computations and less work wouldn’t be progress—it would be a bug. The computations associated with proofs-of-work are thus a strange and novel good to attempt to value.
When proofs-of-work are thought of as disincentives against resource abuse, it is not necessary to value them precisely or consistently. All that matters is that the email service provider sets difficulties low enough to be unnoticeable for legitimate users yet high enough to be prohibitive for spammers. There is thus a broad range of acceptable “prices” and each participant acts as their own pricing authority, applying a local pricing function.
But units of a currency are meant to be fungible, each having the same value. Due to changes in technology over time, two units of currency created with the same proof-of-work difficulty—as measured by the number of corresponding computations—may have radically different real-world costs of production, as measured by the time, energy, and/or capital required to perform those computations. When computations are sold for currency, and the underlying cost of production is variable, how can the market ensure a consistent price?
Nick Szabo clearly identified this pricing problem when describing bit gold:
“The main problem…is that proof of work schemes depend on computer architecture, not just an abstract mathematics based on an abstract “compute cycle.” …Thus, it might be possible to be a very low cost producer (by several orders of magnitude) and swamp the market with bit gold.” – Szabo, 2005
A decentralized currency created through proof-of-work will experience supply gluts and crashes as the supply of computations changes over time. To accommodate this volatility, the network must learn to dynamically price computations.
Early digital currencies attempted to price computations by collectively measuring the “cost of computing”. Wei Dai, for example, proposed the following hand-wavy solution in b-money:
“The number of monetary units created is equal to the cost of the computing effort in terms of a standard basket of commodities. For example if a problem takes 100 hours to solve on the computer that solves it most economically, and it takes 3 standard baskets to purchase 100 hours of computing time on that computer on the open market, then upon the broadcast of the solution to that problem everyone credits the broadcaster’s account by 3 units.” – Dai, 1998
Unfortunately, Dai does not explain how users in a supposedly decentralized system are supposed to agree upon the definition of a “standard basket”, which computer solves a given problem “most economically”, or the cost of computation on the “open market”. Achieving consensus among all users about a time-varying shared dataset is the essential problem of decentralized systems!
To be fair to Dai, he realized this:
“One of the more problematic parts in the b-money protocol is money creation. This part of the protocol requires that all [users] decide and agree on the cost of particular computations. Unfortunately because computing technology tends to advance rapidly and not always publicly, this information may be unavailable, inaccurate, or outdated, all of which would cause serious problems for the protocol.” – Dai, 1998
Dai would go on to propose a more sophisticated auction-based pricing mechanism which Satoshi would later say was the starting point for his ideas. We will return to this auction scheme below, but first let’s turn to bit gold, and consider Szabo’s insights into the problem.
Use external markets
Szabo claims that proofs-of-work should be “securely timestamped”:
“The proof of work is securely timestamped. This should work in a distributed fashion, with several different timestamp services so that no particular timestamp service need be substantially relied on.” – Szabo, 2005
Szabo links to a page of resources on secure timestamping protocols but does not describe any specific algorithm for secure timestamping. The phrases “securely” and “distributed fashion” are carrying a lot of weight here, hand-waving through the complexities of relying upon one (or many) “outside the system” services for timestamping.[6]
The time a unit of digital currency was created is important because it links the computations performed to real-world production cost.
Regardless of implementation fuzziness, Szabo was right—the time a proof-of-work was created is an important factor in pricing it because it is related to the cost of computation:
“…However, since bit gold is timestamped, the time created as well as the mathematical difficulty of the work can be automatically proven. From this, it can usually be inferred what the cost of producing during that time period was…” – Szabo, 2005
“Inferring” the cost of production is important because bit gold has no mechanism to limit the creation of money. Anyone can create bit gold by performing the appropriate computations. Without the ability to regulate issuance, bit gold is akin to a collectible:
“…Unlike fungible atoms of gold, but as with collector’s items, a large supply during a given time period will drive down the value of those particular items. In this respect bit gold acts more like collector’s items than like gold…” – Szabo, 2005
Bit gold requires an additional, external process to create fungible units of currency:
“…[B]it gold will not be fungible based on a simple function of, for example, the length of the string. Instead, to create fungible units dealers will have to combine different-valued pieces of bit gold into larger units of approximately equal value. This is analogous to what many commodity dealers do today to make commodity markets possible. Trust is still distributed because the estimated values of such bundles can be independently verified by many other parties in a largely or entirely automated fashion.” – Szabo, 2005
To paraphrase Szabo, “to assay the value of… bit gold, a dealer checks and verifies the difficulty, the input, and the timestamp”. The dealers defining “larger units of approximately equal value” are providing a similar pricing function as Dai’s “standard basket of commodities”. Fungible units are not created in bit gold when proofs-of-work are produced, only later when those proofs are combined into larger “units of approximately equal value” by dealers in markets outside the network.
To his credit, Szabo recognizes this flaw:
“…The potential for initially hidden supply gluts due to hidden innovations in machine architecture is a potential flaw in bit gold, or at least an imperfection which the initial auctions and ex post exchanges of bit gold will have to address.” – Szabo, 2005
Again, despite not having arrived at (what we now know as) the solution, Szabo was pointing us at it: because the cost of computation changes over time, the network must respond to changes in the supply of computation by adjusting the price of money.
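The solution bitcoin eventually arrived at can be sketched in a few lines. This is a simplified model of bitcoin-style retargeting, with the familiar parameters (a 4x clamp, a two-week target window) supplied only as illustration:

```python
def retarget(old_difficulty: float, actual_seconds: float,
             target_seconds: float, max_factor: float = 4.0) -> float:
    """Bitcoin-style retarget: if the last window of blocks arrived
    faster than intended (supply of computation grew), raise the
    difficulty proportionally; if slower, lower it. The per-window
    adjustment is clamped to a 4x move in either direction."""
    factor = target_seconds / actual_seconds
    factor = max(1.0 / max_factor, min(max_factor, factor))
    return old_difficulty * factor

# 2016 blocks intended to take two weeks arrived in one week:
# hashrate roughly doubled, so the "price" of a block doubles too.
assert retarget(100.0, 7 * 86400, 14 * 86400) == 200.0
```

By repricing computation every window, the network absorbs supply gluts automatically instead of relying on external dealers to do it after the fact.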
Use internal markets
Szabo’s dealers would have been an external market that defined the price of (bundles of) bit gold after its creation. Is it possible to implement this market within the system instead of outside it?
Let’s return to Wei Dai and b-money. As mentioned earlier, Dai proposed an alternative auction-based model for the creation of b-money. Satoshi’s design for bitcoin improves directly on b-money’s auction model[7]:
“So I propose an alternative money creation subprotocol, in which [users]… instead decide and agree on the amount of b-money to be created each period, with the cost of creating that money determined by an auction. Each money creation period is divided up into four phases, as follows:
Planning. The [users] compute and negotiate with each other to determine an optimal increase in the money supply for the next period. Whether or not the [network] can reach a consensus, they each broadcast their money creation quota and any macroeconomic calculations done to support the figures.
Bidding. Anyone who wants to create b-money broadcasts a bid in the form of <x, y> where x is the amount of b-money he wants to create, and y is an unsolved problem from a predetermined problem class. Each problem in this class should have a nominal cost (in MIPS-years say) which is publicly agreed on.
Computation. After seeing the bids, the ones who placed bids in the bidding phase may now solve the problems in their bids and broadcast the solutions.
Money creation. Each [user] accepts the highest bids (among those who actually broadcasted solutions) in terms of nominal cost per unit of b-money created and credits the bidders’ accounts accordingly.” – Dai, 1998
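Dai leaves the selection rule informal. Under data shapes of our own choosing (not part of the b-money text), the “Money creation” phase he describes might look like:

```python
def create_money(bids: dict, solved: set, supply_quota: int) -> dict:
    """Sketch of b-money's "Money creation" phase: among bidders who
    actually broadcast solutions, accept the highest bids by nominal
    cost per unit of b-money created, up to the planned supply increase.

    bids: {bidder: (amount_of_bmoney, nominal_cost_in_mips_years)}
    solved: set of bidders who broadcast valid solutions
    """
    valid = {b: bid for b, bid in bids.items() if b in solved}
    # Highest nominal cost paid per unit created wins first.
    ranked = sorted(valid.items(),
                    key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
    credits, created = {}, 0
    for bidder, (amount, _cost) in ranked:
        if created + amount > supply_quota:
            break
        credits[bidder] = amount
        created += amount
    return credits

bids = {"a": (10, 50), "b": (10, 30), "c": (10, 40)}
assert create_money(bids, {"a", "b", "c"}, 20) == {"a": 10, "c": 10}
```

The sketch is trivial for a single node; the unsolved problem, discussed below, is getting every node to compute the same answer from the same set of bids.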
B-money makes significant strides towards the correct market structure for a digital currency. It attempts to eliminate Szabo’s external dealers and allow users to engage in price discovery by directly bidding against each other.
But implementing Dai’s proposal as written would be challenging:
- In the “Planning” phase, users bear the burden of negotiating the “optimal increase in the money supply for the next period”. How “optimal” should be defined, how users should negotiate with each other, and how the results of such negotiations are shared is not described.
- Regardless of what was planned, the “Bidding” phase allows anyone to submit a “bid” to create b-money. Bids include both an amount of b-money to be created and a corresponding amount of proof-of-work, so each bid is a price: the number of computations a given bidder is willing to perform in order to buy a given amount of b-money.
- Once bids are submitted, the “Computation” phase consists of bidders performing the proof-of-work they bid and broadcasting solutions. No mechanism for matching bidders to solutions is provided. More problematically, it’s not clear how users should know that all bids have been submitted—when does the “Bidding” phase end and the “Computation” phase begin?
- These problems recur in the “Money creation” phase. Because of the nature of proof-of-work, users can verify that the proofs they receive in solutions are real. But how can users collectively agree on the set of “highest bids”? What if different users pick different such sets, either due to preference or network latency?
Decentralized systems struggle to track data and make choices consistently, yet b-money requires tracking bids from many users and making consensus choices among them. This complexity prevented b-money from ever being implemented.
The root of this complexity is Dai’s belief that the “optimal” rate at which b-money is created should fluctuate over time based on the “macroeconomic calculations” of its users. Like bit gold, b-money has no mechanism to limit the creation of money. Anyone can create units of b-money by broadcasting a bid and then doing the corresponding proof-of-work.
Both Szabo and Dai proposed using a market exchanging digital currency for computations yet neither bit gold nor b-money defined a monetary policy to regulate the supply of currency within this market.
IV. Satoshi’s monetary policy goals led to bitcoin
In contrast, a sound monetary policy was one of Satoshi’s primary goals for the bitcoin project. In the very first mailing list post where bitcoin was announced, Satoshi wrote:
“The root problem with conventional currency is all the trust that’s required to make it work. The central bank must be trusted not to debase the currency, but the history of fiat currencies is full of breaches of that trust.” – Satoshi, 2009
Satoshi would go on to describe other problems with fiat currencies such as risky fractional reserve banking, a lack of privacy, rampant theft & fraud, and the inability to make micropayments. But Satoshi started with the issue of debasement by central banks—with a concern about monetary policy.
Satoshi wanted bitcoin to ultimately reach a finite circulating supply that cannot be diluted over time. The “optimal” rate of bitcoin creation, for Satoshi, should thus eventually be zero.
This monetary policy goal, more than any other characteristic they personally (or collectively!) possessed, was the reason Satoshi “discovered” bitcoin, the blockchain, Nakamoto consensus, etc. —and not someone else. It’s the short answer to the question posed in the title of this article: Satoshi thought of bitcoin because they were focused on creating a digital currency with a finite supply.
A finite supply of bitcoin is not only a monetary policy goal or a meme for bitcoiners to rally around. It’s the essential technical simplification that allowed Satoshi to build a functional digital currency while Dai’s b-money remained just a fascinating web post.
Bitcoin is b-money with an additional requirement of a predetermined monetary policy. Like many technical simplifications, constraining monetary policy enables progress by reducing scope. Let’s see how each of the phases of b-money creation is simplified by imposing this constraint.
All 21M bitcoin already exist
In b-money, each “money creation period” included a “Planning” phase, in which users were expected to share their “macroeconomic calculations” justifying the amount of b-money they wanted to create at that time. Satoshi’s monetary policy goals of a finite supply and zero tail emission were incompatible with the freedom b-money granted individual users to create money. The first step on the journey from b-money to bitcoin was therefore to eliminate this freedom. Individual bitcoin users cannot create bitcoin. Only the bitcoin network can create bitcoin, and it did so exactly once, in 2009, when Satoshi launched the bitcoin project.
Satoshi was able to replace the recurring “Planning” phases of b-money with a single, predetermined schedule on which the 21M bitcoin created in 2009 would be released into circulation. Users voluntarily endorse Satoshi’s monetary policy by downloading and running the Bitcoin Core software in which this monetary policy is hard-coded.
This changes the semantics of bitcoin’s market for computations. The bitcoin being paid to miners is not newly issued; it’s newly released into circulation from an existing supply.
This framing is crucially different from the naive claim that “bitcoin miners create bitcoin”. Bitcoin miners are not creating bitcoin, they’re buying it. Bitcoin isn’t valuable because “bitcoin are made from energy”—bitcoin’s value is demonstrated by being sold for energy.
Let’s repeat it one more time: bitcoin isn’t created through proof-of-work, bitcoin is created through consensus.
Satoshi’s design eliminates the requirement for ongoing “Planning” phases from b-money by doing all the planning up front. This allowed Satoshi to hard-code a sound monetary policy but also simplified the implementation of bitcoin.
Bitcoin is priced through consensus
This freedom granted to users to create money results in a corresponding burden for the bmoney network. During the “Bidding” phase the b-money network must collect and share money creation “bids” from many different users.
Eliminating the freedom to create money relieves the bitcoin network of this burden. Since all 21M bitcoin already exist, the network doesn’t need to collect bids from users to create money, it merely has to sell bitcoin on Satoshi’s predetermined schedule.
The bitcoin network thus offers a consensus asking price for the bitcoin it is selling in each block. This single price is calculated by each node independently using its copy of the blockchain. If nodes have consensus on the same blockchain (a point we will return to later) they will all offer an identical asking price at each block.[8]
The first half of the consensus price calculation determines how many bitcoin to sell. This is fixed by Satoshi’s predetermined release schedule; all bitcoin nodes in the network calculate the same amount for a given block.
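As a sketch (the function names here are mine, not from the source), the release schedule can be written down directly: the per-block subsidy starts at 50 BTC and halves every 210,000 blocks, and summing every era shows why the supply can never exceed 21M:

```python
SATS_PER_BTC = 100_000_000
HALVING_INTERVAL = 210_000

def block_subsidy(height: int) -> int:
    """Satoshis released into circulation by the block at `height`
    (a sketch of the halving schedule hard-coded in Bitcoin Core)."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:  # shifting a 64-bit value further would be undefined in C++
        return 0
    return (50 * SATS_PER_BTC) >> halvings

# Total supply: each era's subsidy times the number of blocks in that era.
total_sats = sum(
    block_subsidy(era * HALVING_INTERVAL) * HALVING_INTERVAL for era in range(64)
)
assert total_sats < 21_000_000 * SATS_PER_BTC  # just under 21M BTC
```

Because of the integer halving, the total lands slightly below 21M BTC rather than exactly on it.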
The second half of the consensus asking price is the number of computations the current subsidy is being sold for. Again, all bitcoin nodes in the network calculate the same value (we will revisit this difficulty calculation in the next section).
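This half of the price can be sketched as expected computations per block: difficulty times a proportionality factor, roughly 2^32 hashes at difficulty 1 (per the caveat in footnote [8]). The function names are my own:

```python
def computations_per_block(difficulty: float) -> float:
    """Expected number of hashes the network must perform, on average,
    to find one block at the given difficulty (difficulty 1 ~ 2^32 hashes)."""
    return difficulty * 2**32

def asking_price(difficulty: float, subsidy_sats: int) -> float:
    """Consensus asking price: expected computations per satoshi of subsidy."""
    return computations_per_block(difficulty) / subsidy_sats
```

Since every node derives both inputs from the same blockchain, every node quotes the same price.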
Together, the network subsidy and difficulty define the current asking price of bitcoin as denominated in computations. Because the blockchain is in consensus, this price is a consensus price.
Users in b-money also were presumed to have a consensus “blockchain” containing the history of all transactions. But Dai never thought of the simple solution of a single consensus asking price for the creation of new b-money, determined solely by the data in that blockchain.
Instead, Dai assumed that money creation must go on forever. Individual users would therefore need to be empowered to affect monetary policy – just as in fiat currencies. This perceived requirement led Dai to design a bidding system which prevented b-money from being implemented.
This added complexity was removed by Satoshi’s requirement of a predetermined monetary policy.
Time closes all spreads
In the “Computation” phase of b-money, individual users would perform the computations they’d committed to in their prior bids. In bitcoin, the entire network is the seller – but who is the buyer?
In the email delivery market, the buyers were individuals wanting to send emails. The pricing authority, the email service provider, would set a price considered cheap for individuals but expensive for spammers. Even if the number of legitimate users increased, the price could remain the same, because the computing power available to each individual user remained roughly constant.
In b-money, each user who contributed a bid for money creation was supposed to subsequently perform the corresponding number of computations themselves. Each user was acting as their own pricing authority based on their knowledge of their own computing capabilities.
The bitcoin network offers a single asking price in computations for the current bitcoin subsidy. But no individual miner who finds a block has performed this number of computations.[9] The individual miner’s winning block is proof that all miners collectively performed the required number of computations. The buyer of bitcoin is thus the global bitcoin mining industry.
Having arrived at a consensus asking price, the bitcoin network will not change that price until more blocks are produced. These blocks must contain proofs-of-work at the current asking price. The mining industry therefore has no choice if it wants to “execute a trade” but to pay the current asking price in computations.
The only variable the mining industry can control is how long it will take to produce the next block. Just as the bitcoin network offers a single asking price, the mining industry thus offers a single bid—the time it takes to produce the next block meeting the network’s current asking price.
“To compensate for increasing hardware speed and varying interest in running nodes over time, the proof-of-work difficulty is determined by a moving average targeting an average number of blocks per hour. If they’re generated too fast, the difficulty increases.” – Nakamoto, 2008
Satoshi is modestly describing the difficulty adjustment algorithm, often cited as one of the most original ideas in bitcoin’s implementation. This is true, but instead of focusing on the inventiveness of the solution, let’s instead focus on why solving the problem was so important to Satoshi in the first place.
Projects such as bit gold and b-money didn’t need to constrain the rate of money creation over time because they didn’t have a fixed supply or a predetermined monetary policy. Periods of faster or slower money creation could be compensated for through other means, e.g. external dealers putting bit gold tokens into larger or smaller bundles, or b-money users changing their bids.
But Satoshi’s monetary policy goals required bitcoin to have a predetermined rate at which bitcoin was to be released for circulation. Constraining the (statistical) rate at which blocks are produced over time is natural in bitcoin because the rate of block production is the rate at which the initial supply of bitcoin is being sold. Selling 21M bitcoin over 140 years is a different proposition than allowing it to be sold in 3 months.
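The “140 years” figure follows from the schedule itself: the integer subsidy in satoshis reaches zero after 33 halvings, and at one block per ten minutes that takes on the order of 130 years (a back-of-the-envelope sketch, not a formula from the source):

```python
SECONDS_PER_BLOCK = 600       # target block time
HALVING_INTERVAL = 210_000

# 50 BTC = 5e9 satoshis; after 33 integer halvings the subsidy is zero
assert (5_000_000_000 >> 33) == 0
last_subsidy_height = 33 * HALVING_INTERVAL   # 6,930,000

years = last_subsidy_height * SECONDS_PER_BLOCK / (365.25 * 24 * 3600)
# roughly 131.8 years after 2009, i.e. the final satoshi circulates around 2140
```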
Moreover, bitcoin can actually implement this constraint because the blockchain is Szabo’s “secure timestamping protocol.” Satoshi describes bitcoin as first and foremost a “distributed timestamp server on a peer-to-peer basis,” and early implementations of the bitcoin source code use the word “timechain” rather than “blockchain” to describe the shared data structure that implements bitcoin’s proof-of-work market.[10]
Unlike bit gold or b-money, tokens in bitcoin do not experience supply gluts. The bitcoin network uses the difficulty adjustment to change the price of money in response to changes in the supply of computations.
Bitcoin’s difficulty readjustment algorithm leverages this capability. The consensus blockchain is used by participants to enumerate the historical bids made by the mining industry and readjust the difficulty in order to move closer to the target block time.
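A hedged sketch of the retargeting rule Satoshi describes: Bitcoin recomputes difficulty every 2,016 blocks, comparing actual elapsed time against the two-week target and clamping the adjustment to a factor of four in either direction (function and variable names are mine):

```python
TARGET_TIMESPAN = 2016 * 600  # two weeks, in seconds

def retarget(old_difficulty: float, actual_timespan: int) -> float:
    """If the last 2016 blocks arrived too fast, raise difficulty;
    too slow, lower it. Bitcoin clamps the change to 4x either way."""
    clamped = max(TARGET_TIMESPAN // 4, min(actual_timespan, TARGET_TIMESPAN * 4))
    return old_difficulty * TARGET_TIMESPAN / clamped
```

For example, if the mining industry's collective "bid" halved the block time, the network doubles its asking price at the next retarget.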
A standing order creates consensus
The chain of simplifications caused by demanding strong monetary policy extends to the “Money creation” phase of b-money.
User-submitted bids in b-money suffer from a “nothing at stake” problem. There is no mechanism preventing users from submitting bids offering a huge amount of b-money for very little work. This requires the network to both track which bids have been completed and accept only the “highest bids…in terms of nominal cost per unit of b-money created” in order to avoid such nuisance bids. Each b-money participant must track an entire order book worth of bids, match bids with their subsequent computations, and settle only those completed orders with the highest prices.
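To make the burden concrete, here is a sketch (my own construction, not Dai’s specification) of the settlement rule every b-money participant would have to run identically, accepting only the highest-priced completed bids:

```python
def settle_bids(completed_bids, planned_supply):
    """Accept completed bids in order of nominal cost per unit of b-money
    created, until the collectively planned supply is exhausted.
    `completed_bids` is a list of (proof_of_work_done, units_requested)."""
    by_price = sorted(completed_bids, key=lambda b: b[0] / b[1], reverse=True)
    minted, accepted = 0, []
    for work, units in by_price:
        if minted + units > planned_supply:
            continue  # skip bids that would overshoot the plan
        minted += units
        accepted.append((work, units))
    return accepted
```

Every participant must see the same bid set and run this rule on it, or their ledgers diverge: exactly the consensus problem b-money left unsolved.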
This problem is an instance of the more general problem of consensus in decentralized systems, also known as the “Byzantine generals” or sometimes the “double-spend” problem in the context of digital currencies. Sharing an identical sequence of data among all participants is challenging inside an adversarial, decentralized network. Existing solutions to this problem – so-called “Byzantine-fault tolerant (BFT) consensus algorithms” – require prior coordination among participants or a supermajority (>67%) of participants to not behave adversarially.
Bitcoin doesn’t have to manage a large order book of bids because the bitcoin network offers a single consensus asking price. This means bitcoin nodes can accept the first (valid) block they see that meets the network’s current asking price— nuisance bids can easily be ignored and are a waste of a miner’s resources.
Consensus pricing of computations allows the matching of buy/sell orders in bitcoin to be done eagerly, on a first-come, first-served basis. Unlike b-money, this eager order matching means that bitcoin’s market has no phases—it operates continuously, with a new consensus price being calculated after each individual order is matched (block is found). To avoid forks caused by network latency or adversarial behavior, nodes must also follow the heaviest chain rule. This greedy order settling rule ensures that only the highest bids are accepted by the network.
This combined eager-greedy algorithm, in which nodes accept the first valid block they see and also follow the heaviest chain, is a novel BFT algorithm which rapidly converges on consensus about the sequence of blocks. Satoshi spends 25% of the bitcoin white paper demonstrating this claim.[11]
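The greedy half of the rule can be sketched as follows (a simplification under my own naming; real nodes compare cumulative chainwork derived from each block's target):

```python
def chain_work(difficulties, hashes_per_unit=2**32):
    """Total expected computations proven by a chain, given each
    block's difficulty (difficulty 1 ~ 2^32 expected hashes)."""
    return sum(d * hashes_per_unit for d in difficulties)

def best_chain(current, candidate):
    """Greedy half of the rule: follow the chain proving the most total work.
    (The eager half: a node also accepts the first valid block it sees.)"""
    return candidate if chain_work(candidate) > chain_work(current) else current
```

Note that a shorter chain of higher-difficulty blocks can outweigh a longer chain of easy blocks, which is why “heaviest”, not “longest”, is the correct rule (see footnote [11]).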
We established in previous sections that bitcoin’s consensus asking price itself depends on the blockchain being in consensus. But it turns out that the existence of a single consensus asking price is what allows the market for computations to eagerly match orders, which is what leads to consensus in the first place!
Moreover, this new “Nakamoto consensus” only requires a majority (>50%) of participants to not be adversarial, a significant improvement on the prior state of the art. It was a cypherpunk like Satoshi, rather than a traditional academic or industry researcher, who made this theoretical computer science breakthrough, because of their narrow focus on implementing sound money rather than a generic consensus algorithm for distributed computing.
V. Conclusion
B-money was a powerful framework for building a digital currency but one that was incomplete because it lacked a monetary policy. Constraining b-money with a predetermined release schedule for bitcoins reduced scope and simplified implementation by eliminating the requirement to track and choose among user-submitted money creation bids. Preserving the temporal pace of Satoshi’s release schedule led to the difficulty adjustment algorithm and enabled Nakamoto consensus, widely recognized as one of the most innovative aspects of bitcoin’s implementation.
There is a lot more to bitcoin’s design than the aspects discussed so far. We have focused this article on the “primary” market within bitcoin, the market which distributes the initial bitcoin supply into circulation.
The next article in this series will explore the market for bitcoin transaction settlement and how it relates to the market for distributing the bitcoin supply. This relationship will suggest a methodology for how to build future markets for decentralized services on top of bitcoin.
To continue your Bitcoin education, click here to download the full report: “How to Position for the Bitcoin Boom” by Tuur Demeester, prepared for Unchained
Acknowledgements
I’ve been ranting about bitcoin and markets for years now and must thank the many people who listened and helped me sharpen my thinking. In particular, Ryan Gentry, Will Cole and Stephen Hall met with me weekly to debate these ideas. I would not have been able to overcome countless false starts without their contributions and their support. Ryan also helped me begin talking about these ideas publicly in our Bitcoin 2021 talk. Afsheen Bigdeli, Allen Farrington, Joe Kelly, Gigi, Tuur Demeester, and Marty Bent, have all encouraged me over the years and provided valuable feedback. I must also apologize to Allen for turning out to be such a lousy collaborator. Finally, Michael Goldstein may be better known for his writing & memes, but I’d like to thank him for the archival work he does at the Nakamoto Institute to keep safe the history of digital currencies.
Footnotes
[1] The title of this series is taken from the first telegraph message in history, sent by Samuel Morse in 1844: “What hath God wrought?”.
[2] Bitcoin: A Peer-to-Peer Electronic Cash System, available: https://bitcoin.org/bitcoin.pdf
[3] Pricing via Processing or Combatting Junk Mail by Dwork and Naor available: https://www.wisdom.weizmann.ac.il/~naor/PAPERS/pvp.pdf
[4] Despite originating the idea, Dwork & Naor did not invent “proof-of-work”—that moniker was provided later in 1999 by Markus Jakobsson and Ari Juels.
[5] Hal Finney’s RPOW project was an attempt at creating transferable proofs-of-work but bitcoin doesn’t use this concept because it doesn’t treat computations as currency. As we’ll see later when we examine bit gold and b-money, computations cannot be currency because the value of computations changes over time while units of currency must have equal value. Bitcoin is not computations, bitcoin is currency that is sold for computations.
[6] At this juncture, some readers may believe me dismissive of the contributions of Dai or Szabo because they were inarticulate or hand-wavy on some points. My feelings are the exact opposite: Dai and Szabo were essentially right and the fact that they did not articulate every detail the way Satoshi subsequently did does not detract from their contributions. Rather, it should heighten our appreciation of them, as it reveals how challenging the advent of digital currency was, even for its best practitioners.
[7] Dai’s b-money post is the very first reference in Satoshi’s white paper, available: http://www.weidai.com/bmoney.txt
[8] There are two simplifications being made here:
a. The number of bitcoin being sold in each block is also affected by the transaction fee market, which is out of scope for this article, though lookout for subsequent work.
b. The difficulty as reported by bitcoin is not exactly the number of expected computations; one must multiply by a proportionality factor.
[9] At least not since the bad old days when Satoshi was the only miner on the network.
[10] Gigi’s classic “Bitcoin is Time” is a great introduction to the deep connections between bitcoin and time, available: https://dergigi.com/2021/01/14/bitcoin-is-time/
[11] Satoshi blundered both in their analysis in the white paper and their subsequent initial implementation of bitcoin by using the “longest chain” rule instead of the “heaviest chain” rule.
Who Will Be The Next Spot Bitcoin ETF Issuer To Support BTC Developers After Bitwise And VanEck?
Of the 11 financial institutions that issued spot Bitcoin ETFs in January 2024, only two — Bitwise and VanEck — have pledged to donate a percentage of their profits to open-source Bitcoin development.
Regardless of whether large holders are harassing ETF sponsors for funding core dev, the real question is – why are only @BitwiseInvest and @vaneck_us doing so? Bitcoin is an ongoing project and core dev continues to be underfunded
— nic “bankful” carter (@nic__carter) April 10, 2024
In considering the logic behind Bitwise and VanEck’s decision to donate to developers who maintain and update the Bitcoin protocol, it’s difficult to imagine why more spot Bitcoin ETF issuers haven’t followed suit.
“While we use the language of ‘donation’ when we support devs, I think in reality it’s closer to a self-investment into making the asset itself stronger,” Hong Kim, co-founder and CTO of Bitwise, wrote in an AMA thread on Stacker News. “Many people think bitcoin just magically gets maintained, but that’s not true! If you manage a large pool of bitcoin and you take fees for doing so, then why would you not reinvest some of that into the underlying infrastructure?”
For this reason, Bitwise, which issued its spot Bitcoin ETF under the name Bitwise Bitcoin ETF (ticker: BITB), committed to donating 10% of its ETF fee profits to three different nonprofits that fund Bitcoin Core developers — OpenSats, Brink and the Human Rights Foundation (HRF) — for 10 years.
“Brink, OpenSats and HRF were the most established nonprofits with a track record of funding Bitcoin devs — they had the proof of work, so to speak,” Kim told Bitcoin Magazine.
VanEck, which issued its spot Bitcoin ETF under the name VanEck Bitcoin Trust (ticker: HODL), also sees the value in supporting Bitcoin Core developers. Hence, it promised to contribute 5% of HODL profits to Brink as well as make an initial $10,000 donation to the organization.
“We believe TradFi stands to gain from the efforts of Bitcoin’s Core contributors,” Matthew Sigel, Head of Digital Asset Research at VanEck, told Bitcoin Magazine.
“As we stand to profit from Bitcoin’s price increase, it makes sense that we also give back to the work of the innovators who make the chain possible,” he added.
Given that it’s only sensible for spot Bitcoin ETF issuers to give back to Bitcoin Core developers — those who support and further the underlying asset for their financial product — which will be next to follow Bitwise and VanEck’s lead?
The development of Bitcoin and open-source scaling solutions for the protocol could benefit significantly from more of these major financial institutions donating even a small portion of the profits from their spot Bitcoin ETF fees.
6 Common Pitfalls of Self-Directed and Checkbook Bitcoin IRAs
Originally published on Unchained.com.
Unchained is the official US Collaborative Custody partner of Bitcoin Magazine and an integral sponsor of related content published through Bitcoin Magazine. For more information on services offered, custody products, and the relationship between Unchained and Bitcoin Magazine, please visit our website.
You don’t often see the term “Roth IRA” trending online, but in 2021, tech investor Peter Thiel made headlines for his $5 billion tax-free Roth IRA piggy bank. How did he do it? The answer is alternative investments. He used a self-directed IRA to invest in early-stage tech companies multiple times over. Is it a loophole? Possibly. But it happened, it got attention, and the IRA structure in question could come under further scrutiny.
“Thiel has taken a retirement account worth less than $2,000 in 1999 and spun it into a $5 billion windfall.” – ProPublica (2021)
Let’s look at six common risks associated with self-directed and checkbook IRAs, how they may apply in the context of bitcoin, and why there may be increased regulation coming in the future. But first, we need to define our terms and differentiate between IRA structures.
The different IRA structures
The different IRA structures can behave in an “every square is a rectangle, but not all rectangles are squares” kind of way. IRAs can be Traditional (pre-tax) or Roth (post-tax) regardless of custodial relationship/structure. All IRAs are custodial. A custodian, in the context of IRAs, is a licensed financial institution overseeing and administering the IRA.
Brokerage and Bank IRAs
Brokerage and bank IRAs are the most familiar and common types. Brokerage and Bank IRAs allow investors to invest in stocks, bonds, ETFs, mutual funds, and other securities, as well as banking products (CDs, deposit accounts, etc.). Examples include your typical Fidelity, TD Ameritrade, or Charles Schwab IRA. The Unchained IRA is closest to this structure in this hierarchy.
Self-directed IRA (SDIRA)
A self-directed IRA is a custodial IRA where the custodian allows for expanded investment options outside of or in addition to typical brokerage and bank assets (stocks, bonds, CDs, etc.). Owners of self-directed IRAs can invest in non-traditional assets like real estate, businesses, private loans, tax liens, precious metals, and digital assets. Although the IRS doesn’t have a definitive list of allowed investments, it certainly has a few that are not allowed (collectibles, life insurance, certain derivatives, S-Corps, etc.).
Checkbook IRA
Checkbook IRAs are a subset of self-directed IRAs. The term “checkbook IRA” is not standard, but it usually refers to a self-directed IRA that gives an account owner control of investments through a checking account, usually through an LLC conduit. The account holder can then make investments with IRA funds simply by writing a check (“checkbook control”). With the added freedom of additional investment choices comes added responsibility of administration, as well as legal ambiguity as to whether the structure still qualifies as a tax-exempt IRA.
Non-checkbook self-directed IRA
A subset of self-directed IRA where the custodian approves transactions before investments are made. Investors must wait for the custodian to review each potential investment and formally accept title to the underlying asset. These were commonly used for real estate and private equity investments and began regaining popularity once additional legal uncertainties arose regarding checkbook IRAs in late 2021 (discussed in section 4 below).
Use code: “btcmag” for $100 off Unchained IRA + 1 year free of Bitcoin Magazine Pro market research. Click here to hold the keys to your retirement using Unchained’s collaborative custody financial services.
Risks to watch for when using a self-directed or checkbook IRA
1. Liquidity
Unfortunately, many self-directed assets lack liquidity, making them difficult to sell quickly. Examples include real estate, privately held businesses, precious metals, etc. If cash is ever needed for a distribution or internal expense, selling an asset fast could be a problem (which compounds into other problems, e.g., accidentally commingling funds). Self-directed IRA owners should conduct thorough due diligence on asset liquidity before committing to an investment strategy.
2. Formation and legal structure
When forming a checkbook IRA, a self-directed IRA LLC is established first. Then, the LLC establishes a checking account just like any other business entity. Next, the LLC is funded by sending the IRA funds to the checking account.
With the proper legal structure, the IRA owner can become the sole managing member of the LLC and have signing authority over the checking account. However, improper legal structure, registration, or titling could all cause serious problems for the tax-advantaged status of the IRA. Many checkbook IRA facilitators are competent, but errors could always lead to issues and possible disqualification/loss of the entire IRA.
3. Misreporting transactions
Within a checkbook IRA, owners can fund investments quickly and freely, but this comes with the responsibility of properly following rules and self-reporting transactions.
At the end of each year, the owner of the LLC will need to provide complete transaction details to its IRA custodian and submit fair market valuation (FMV) information. Because the custodian has no visibility into each transaction you make, it is more likely to misreport income on your investments unless you supply complete and accurate records. Always ensure the custodian has accurate information to avoid accidentally breaking the law.
4. “Deemed distribution” treatment
Clients looking to buy precious metals, real estate, or digital assets should know the risk of “deemed distributions” treatment. A recent United States tax court case, McNulty v. Commissioner, illustrates the considerable risks of maintaining a checkbook IRA. In the McNulty case, a taxpayer used her checkbook IRA LLC to purchase gold from a precious metals dealer. She stored the LLC’s gold at home in her personal safe. The court ruled that her “unfettered control” over the LLC’s gold without third party supervision created a deemed taxable distribution from her IRA.
It is impossible to know how far a tax court will go applying “deemed distribution” treatment to any given transaction or investment within a checkbook IRA. For checkbook IRA owners that hold the keys to bitcoin in an unsupervised structure, there is a risk that the McNulty ruling could cause your entire IRA to be subject to tax. Further, since alternative investments were fairly recently (2015) added to IRS Publication 590, it’s entirely possible that the IRS and Congress could apply more scrutiny to checkbook IRAs going forward. Read more about the McNulty case and its implications.
5. Prohibited transactions
All self-directed IRA owners are always prohibited from commingling personal and IRA assets or using any personal funds to improve IRA assets. “Self-dealing” is one of the most common pitfalls for self-directed account holders. For example, if you use your IRA to purchase real estate, you are not allowed to use the property yourself—not even a little bit. You cannot live there, stay there, or rent office space to yourself there. You are not even allowed to make your own repairs or provide “sweat equity.”
It’s not only the IRA owner that can’t participate in any “self-dealing,” but spouses, children, and grandchildren as well. They are considered disqualified individuals, and penalties are stiff. These are stringent rules and can result in huge tax headaches if breached. I don’t intend to crush any dreams, but investing your 401k/IRA into your lakefront Airbnb vacation home and having you or your family stay there even once is a bad idea. No purchasing a rental home and renting it out to family members either. For further fun, see the IRS list of prohibited transactions here.
Here are a few examples of how prohibited transactions rules could be applied to digital asset investors:
Commingling personal wallets with IRA wallets
Leverage without a non-recourse loan
Investing in certain collectible NFTs1
6. Financing
Financing within a self-directed IRA is also more complicated for several reasons:
Typically, a non-recourse loan and larger down payment are needed for any property purchases.
Unexpected costs and fees can add up quickly and eat into any profits.
IRA-owned active businesses could run into the issue of UBIT (Unrelated Business Income Tax). This also affects the overlap of bitcoin mining within an IRA.
Any income and expenses must remain within the IRA structure and never be commingled with personal funds. For example, when the water heater goes out (real estate) or salaries need to be paid (businesses), the IRA itself must pay for those services out of the IRA’s own cash. IRA owners could be tempted to commingle funds temporarily as they look for short-term liquidity to solve their cash needs.
What does this mean for bitcoin IRAs?
The self-directed IRA space has many potential risks if not properly managed. The IRS and Congress have been paying special attention to how these structures are used and abused. Combine this with their interest in regulating digital assets, and the landscape appears ripe for further scrutiny. With that, bitcoin IRAs need a unique approach that mitigates these pitfalls.
Unchained IRA is not a checkbook IRA
If you’re looking to hold actual bitcoin in your IRA account, you should consider the Unchained IRA. It’s not a “checkbook IRA” where transactions must be self-reported, and Unchained uses its key in the collaborative custody setup to track inflows and outflows of IRA vaults. That visibility mechanism allows the custodian to actively monitor the IRA and therefore allows users to remain compliant with current IRA rules and regulations.
There is no self-reporting required, and the non-checkbook structure helps mitigate the risk of potential pitfalls (McNulty, misreporting transactions, etc.). If bitcoin appreciates like many investors hope and expect, holding coins in an IRA structure properly is of the utmost importance.
This article is provided for educational purposes only, and cannot be relied upon as tax advice. Unchained makes no representations regarding the tax consequences of any structure described herein, and all such questions should be directed to an attorney or CPA of your choice. Jessy Gilger was an Unchained employee at the time this post was written, but he now works for Unchained’s affiliate company, Sound Advisory.
1 While not technically part of the Prohibited Transaction Rules (section 4975 of the Internal Revenue Code), collectibles are separately prohibited from being held in an IRA under section 408(m).
The Technical Architecture of the Quantum Cats
Quantum Cats is a collection of 3,333 Ordinals Inscriptions that evolve over time to reveal different artwork. It is the first collection of Inscriptions to evolve over time, and it was created in a time of high fees and an unpredictable future fee market. This is not an article about the aesthetic virtues of the artwork (I think they look cool) or reasons to participate in the market for them; this is an article about the technical implementation of Quantum Cats. I think the engineering challenges we faced and the techniques we implemented to meet those challenges are interesting and potentially useful both to future Ordinals creators and to other Bitcoin application developers generally.
Before getting into the technical nitty-gritty of Quantum Cats, it's useful to understand the experience we were trying to create. Ordinals users hold inscriptions (digital collectibles implemented in the Ordinals protocol and transferred with Bitcoin transactions) in self-custody Bitcoin wallets that offer coin control and transaction-construction features, allowing transfer of specific ordinals as well as signing of more complex transaction types (such as trustless offers and swaps on ordinals marketplaces). We wanted to create an Inscription collection that would evolve over time, adding or changing attributes or traits of the Cats.
The artwork for Inscriptions is published on-chain in the witness of a Taproot transaction, in a special encoding called an Envelope (ordinals-aware software parses transactions looking for this envelope in order to find inscriptions). That means any particular inscription's data is immutable and cannot be changed once it has been published (short of a re-org). However, there are a couple of different ways to deliver the experience of changing artwork, even though the artwork itself never actually changes (and in fact, having access to the old artwork is great if you like it more!).
Recursion is an ordinals feature where one inscription can reference the content of another. For example, you can inscribe an HTML page and have it include images that live in other inscriptions. Ordinals software renders HTML pages in iframes, so an ordinal's content can be built up client-side from multiple inscriptions. HTML inscriptions cannot include content from the broader web, only from other inscriptions or a small set of endpoints provided by the ordinals software (for example, there is an endpoint to fetch the current bitcoin block height). This means that recursive inscriptions are still entirely on-chain; they are simply decomposed, which allows for composability and re-use of common components. For example, all the Quantum Cats with a red background can refer to a single inscription containing the red background, instead of each of them needing to put the same data on-chain.
When one inscription refers to another, it does so by its Inscription ID. An Inscription ID is made up of the ID of the Bitcoin transaction in which the inscription data is revealed, the letter "i", and the index of the inscription created in that transaction. For example, the inscription 4b31771df21656d2a77e6fa18720a6dd94b04510b9065a7c67250d5c89ad2079i0 is the first inscription created in the bitcoin transaction 4b31771df21656d2a77e6fa18720a6dd94b04510b9065a7c67250d5c89ad2079. That means if you inscribe an image (like a PNG) and then inscribe an HTML page whose img tag references the image's inscription ID, the HTML inscription will render the content of the image inscription. If the HTML inscription refers to an image inscription that is not actually on-chain (yet), the ordinals server returns a 404 (not found) error, which the HTML inscription can quietly swallow.

If we presign image inscriptions but don't broadcast them to the Bitcoin network, we can obtain their future inscription IDs (because they are just a transaction ID and an index) and include those IDs in HTML inscriptions that we do broadcast. When someone views the HTML inscription, it renders the content of its references that are on-chain, but not the presigned, unbroadcast components. As more components are published, the HTML inscription automatically becomes able to render them. This is the core mechanism the Quantum Cats collection uses to evolve its artwork: presigned transactions for traits that are progressively revealed over time. As we'll see, fee management and market dynamics introduced complexities that required some additional layers of indirection and features, but presigned transactions with pre-computed transaction IDs are the key feature of Bitcoin that made the collection possible.
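This inscription-ID precomputation is worth making concrete. A txid commits to the signed transaction serialized without witness data, so it can be computed the moment the transaction is signed, before broadcast. A minimal sketch (the transaction bytes are a placeholder, not a real serialization):

```python
# Sketch: deriving the inscription ID of a presigned but unbroadcast
# reveal transaction. A txid is the double-SHA256 of the transaction
# serialized *without* witness data, so it is fixed once the transaction
# is signed; the inscription ID is just "<txid>i<index>".
import hashlib

def txid_from_serialization(tx_bytes: bytes) -> str:
    # txids are displayed byte-reversed relative to the raw hash output
    digest = hashlib.sha256(hashlib.sha256(tx_bytes).digest()).digest()
    return digest[::-1].hex()

def inscription_id(txid: str, index: int) -> str:
    return f"{txid}i{index}"

presigned_tx = bytes.fromhex("02000000" + "00" * 60)  # placeholder bytes
future_id = inscription_id(txid_from_serialization(presigned_tx), 0)
assert len(future_id) == 66 and future_id.endswith("i0")
```

Any HTML inscription broadcast today can embed `future_id`; it will start resolving only once the presigned transaction is actually revealed.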
Even though the contents of a presigned but unrevealed inscription are unknown before the transaction is broadcast, the same inscription ID will always resolve to the same content. This created a problem: even though people couldn't tell what a future trait would be (like a background or a body trait), they could count how many times a particular inscription ID occurred, infer which future traits were more or less rare, and trade Cats on their future evolutions. We wanted evolutions to be surprising and fun, which meant nobody should know ahead of time how future evolutions would change the relative rarity of different Cats. So we introduced a layer of indirection: every Cat refers to a presigned (but unrevealed) "Layer Connector" that maps each Cat, by a unique ID, to presigned artwork. That means, for example, that every Cat refers to the same Layer Connector for its initial background image; only once that Layer Connector is broadcast to the network can people learn which backgrounds are more or less common. This technique also allowed for space savings: since every Cat refers to identical Layer Connectors, the HTML that imports them can be inscribed once and then referred to by each of the 3,333 Cat inscriptions. In fact, each Cat inscription was reduced to 109 bytes: just a unique Cat ID and a script tag that imports the logic to fetch and render the common set of Layer Connectors, look up the unique artwork for each layer, and render it. Moving the mapping of each Cat to its artwork out of the individual Cat inscriptions and into a common inscription, plus the layer of presigned indirection, not only solved the information leak about trait rarity but also saved approximately 5 BTC in inscription costs!
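A toy model of this indirection (all inscription IDs below are invented placeholders, not real Quantum Cats IDs):

```python
# Sketch of the layer-connector indirection. Every Cat embeds the same
# connector ID, so nothing about trait frequency leaks before the
# connector's reveal transaction is broadcast.
from typing import Optional

BACKGROUND_CONNECTOR_ID = "aaaa1111i0"  # identical in all 3,333 Cats

# The connector's contents, visible only once its reveal tx confirms:
background_connector = {
    "cat-0001": "bbbb2222i0",  # red background (shared asset, inscribed once)
    "cat-0002": "bbbb2222i0",
    "cat-0003": "cccc3333i0",  # blue background
}

def resolve_artwork(cat_id: str, connector: Optional[dict]) -> Optional[str]:
    # connector is None while its transaction is presigned but unbroadcast
    if connector is None:
        return None  # the dispatcher skips this layer for now
    return connector.get(cat_id)

assert resolve_artwork("cat-0001", None) is None
assert resolve_artwork("cat-0003", background_connector) == "cccc3333i0"
```

Before the reveal, observers see only one opaque ID per layer; after the reveal, the mapping (and hence the rarity distribution) becomes public all at once.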
With the introduction of Layer-Connector inscriptions and the factoring of rendering logic into a common component, there are now four kinds of assets being inscribed:
1. The actual artwork for each trait of a Cat (a background image, a body, the eyes, etc.)
2. A layer-connector that maps a Cat, by its ID, to a specific artwork asset. This mapping happens once per "layer" (background, body, eyes, mouth, etc.)
3. The core dispatch and rendering logic, which we call the "Dispatcher". It is responsible for fetching a layer connector, looking up the artwork for the Cat in that connector, fetching the artwork asset, and rendering it to a canvas in order. This successive in-order rendering is why we model the artwork as layers.
4. The individual Cat that is distributed to a collector. This is 109 bytes and includes a unique ID and a reference to the Dispatcher, which contains all the rendering code.
In Quantum Cats, there are several hundred artwork assets, 40 layers (meaning 40 layer-connectors), 1 Dispatcher, and 3,333 Cats. The 3,333 Cat inscriptions refer to the inscription ID of the Dispatcher, which refers to the inscription IDs of the 40 layer-connectors, each of which refers to one or more inscription IDs of artwork assets. We presigned these assets in reverse order: first the artwork, to get its inscription IDs; then we rendered those into layer-connectors and presigned those to get their inscription IDs; then we rendered and presigned the Dispatcher; and finally we assembled the individual Cat inscriptions.
Inscription IDs include a Bitcoin transaction ID, and Bitcoin transaction IDs are a function of a transaction's inputs, outputs, version, and locktime. That means if we spend the UTXO that funds a presigned transaction in some other transaction, we can never re-create that same transaction ID, and we break our presigned inscription reference! To avoid this, we created a UTXO to fund every presigned transaction and maintained a database tracking which UTXO was assigned to fund which presigned transaction. We also had automated sanity checks asserting that no two inscriptions spent the same UTXO, that every inscription commit transaction spent only its assigned UTXO, and that the total inputs and outputs of all transactions (including fees) were what we expected. These checks ran whenever the system touched wallets or keys, and gave us confidence that nothing was being signed that shouldn't be. Additionally, we used segregated wallets for different inscription asset types, as further protection against a bug double-assigning a UTXO. We also built a test harness that ran through all of the presigning and publication of inscriptions on regtest and then validated that the data that ended up on-chain matched what was in our control-plane database.
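The checks described above can be sketched as simple invariants over a hypothetical assignment table (all names are made up; the real control plane is not shown in the article):

```python
# Sketch of the automated sanity checks: a control-plane table maps each
# presigned inscription to its assigned funding UTXO ("txid:vout").
assignments = {
    "artwork-001": "f00d-txid:0",
    "artwork-002": "f00d-txid:1",
    "connector-background": "beef-txid:0",
}

def check_no_double_assignment(assignments: dict) -> bool:
    # no two inscriptions may be funded by the same UTXO
    utxos = list(assignments.values())
    return len(utxos) == len(set(utxos))

def check_commit_inputs(inscription: str, spent_utxos: set,
                        assignments: dict) -> bool:
    # a commit transaction may spend only the UTXO assigned to it
    return spent_utxos == {assignments[inscription]}

assert check_no_double_assignment(assignments)
assert check_commit_inputs("artwork-001", {"f00d-txid:0"}, assignments)
assert not check_commit_inputs("artwork-001", {"beef-txid:0"}, assignments)
```

Running invariants like these on every wallet touch is cheap insurance: a single accidental double-spend of a funding UTXO would permanently invalidate a precomputed inscription ID.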
Presigning transactions this way meant we had to pre-commit to the fees each inscription would pay. We can't know what fee rates will be when we eventually reveal these evolutions, so we decided to presign the transactions at a reasonable fee rate and build tooling to bump the fees later if we presigned too low. (If we presigned a fee higher than needed, we would just have to live with it, so part of the analysis was picking a fee rate we were comfortable with even if it turned out we overpaid.) Other than using a transaction accelerator service (paying a miner out of band to include a transaction in a block even if it pays below-market fees), there are two techniques to increase the effective fee rate of a transaction: Replace-By-Fee (RBF) and Child-Pays-For-Parent (CPFP). RBF re-spends the inputs of a transaction in a new transaction that pays a higher fee; because our application relies on pre-committed transaction IDs, this was not an option. CPFP spends an unconfirmed output of a transaction in a new transaction that pays a higher fee than the "parent". For miners to capture the fees from this "child" transaction, they have to include both parent and child as a package, so the effective fee rate ends up being the total fees paid divided by the total virtual size of the package (all the transactions together). Since the parent transaction is left untouched, this was exactly the fee-bumping mechanism we needed.
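The package fee-rate arithmetic can be made concrete. A hedged sketch, using illustrative transaction sizes and the 65 → 150 sat/vB bump mentioned later in the article:

```python
# Sketch of CPFP package math: the effective fee rate is total fees
# divided by total virtual size, so the child's fee must make up the
# difference to the target rate. Sizes here are illustrative.
def cpfp_child_fee(parent_fee: int, parent_vsize: int,
                   child_vsize: int, target_rate: int) -> int:
    package_vsize = parent_vsize + child_vsize
    return max(target_rate * package_vsize - parent_fee, 0)

# A 300 vB parent presigned at 65 sat/vB, bumped with a 150 vB child:
child_fee = cpfp_child_fee(parent_fee=65 * 300, parent_vsize=300,
                           child_vsize=150, target_rate=150)
effective_rate = (65 * 300 + child_fee) / (300 + 150)
assert effective_rate == 150  # the package now pays the target rate
```

Note that the child pays proportionally more than the headline bump suggests, because its fee has to subsidize the parent's underpayment across the whole package size.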
One remaining wrinkle is that we had potentially hundreds of transactions that would need to be fee-bumped. In addition to the difficulty of accurately bumping tens or hundreds of unconfirmed transactions by hand, there are also relay policies that prevent a package of more than 101 KvB (virtual kilobytes) or more than 25 transactions from being relayed through the network. That means if we needed to CPFP 50 transactions, we'd want to do them all in parallel rather than serially. To accomplish this, we built tooling that would:
1. Look at a list of unconfirmed transactions and, for each one, calculate the cost to CPFP-bump that transaction to a target fee rate
2. Aggregate those amounts as outputs in a new transaction that spent from a single input to all of the UTXOs needed to bump the target transactions in parallel
3. Prompt the operator to send the total amount of bitcoin required (including fees for the splitting transaction itself) to a single address
4. Once the deposit was received, broadcast the transaction splitting the deposit into one UTXO for each transaction that needed to be bumped
5. Construct and broadcast CPFP transactions for each of the stuck transactions
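The steps above can be sketched as a small planner. Transaction names, sizes, and the 546-sat remainder on each funding output are illustrative assumptions, not the project's actual tooling:

```python
# Sketch of the fan-out planner: for each stuck parent, compute the CPFP
# cost, then size one "split" transaction output per bump.
def plan_fanout(stuck: list, target_rate: int, child_vsize: int = 150):
    outputs = []
    for parent in stuck:
        # fee the CPFP child must pay to lift its package to target_rate
        child_fee = target_rate * (parent["vsize"] + child_vsize) - parent["fee"]
        # each output covers the child's fee plus a dust-floor remainder
        outputs.append({"bumps": parent["txid"], "amount": child_fee + 546})
    deposit = sum(o["amount"] for o in outputs)  # plus split-tx fees in practice
    return outputs, deposit

stuck = [
    {"txid": "reveal-a", "vsize": 300, "fee": 19500},  # 65 sat/vB
    {"txid": "reveal-b", "vsize": 200, "fee": 13000},  # 65 sat/vB
]
outputs, deposit = plan_fanout(stuck, target_rate=150)
assert len(outputs) == len(stuck)  # one funding UTXO per stuck parent
```

Because each child spends its own freshly created UTXO, every bump forms an independent two-transaction package, keeping each well under the 25-transaction and 101 KvB relay limits.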
We tested this system on regtest, bumping up to 300 transactions at a time. We also had an opportunity to use it on mainnet when we needed to bump the fees of several layer-connector reveal transactions! You can see the "split" transaction here: https://mempool.space/tx/2ec4a8708524faf9901c69da8518b632ec31762730218d3b38ff40954cee882f. Each of its outputs funds a CPFP that bumps an inscription reveal transaction from 65 to 150 sat/vB.
The art assets made up roughly 90% of the total data for the project. We wanted to opportunistically publish as much of the art as we could while fees were low, but we also didn't want people to see the art before the Cats were ready to evolve. So we decided to encrypt the artwork and publish the decryption key alongside the layer connector (which contains the mapping a Cat needs to fetch its trait). This decoupled the bulk data publication from the trait reveal: we could take advantage of a period of lower fees to publish the data, while still showing the world the artwork at a time that made sense for the collection. The mechanics are straightforward: before presigning artwork assets, all of the artwork for a particular layer (again, think background or eyes or mouth) is encrypted with a per-layer encryption key, and that encrypted artwork is used in a presigned inscription as a stream of bytes. The encryption key is then rendered into the (also presigned) layer connector. When the Dispatcher fetches a layer connector, it reads the mapping of Cat ID to art asset along with the decryption key for that layer. When it fetches the art asset, it gets it as a byte array, uses browser cryptography libraries to decrypt the artwork into a PNG, and finally writes it to the canvas.
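The publish-now, reveal-key-later flow can be sketched as follows. The real collection presumably uses a browser-grade cipher (e.g. AES via the WebCrypto API); the hash-based XOR keystream below is only a stdlib stand-in to show the data flow, not the actual scheme:

```python
# Sketch of the encrypted-artwork flow. The cipher here is a toy
# stand-in (SHA-256 counter-mode keystream), NOT the real scheme.
import hashlib

def keystream(key: bytes):
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; symmetric, so the same call decrypts
    return bytes(a ^ b for a, b in zip(data, keystream(key)))

layer_key = b"per-layer-secret"        # later revealed inside the layer connector
artwork = b"\x89PNG...fake png bytes"  # placeholder artwork bytes

# Step 1: inscribe the ciphertext early, while fees are low.
ciphertext = xor_cipher(layer_key, artwork)
assert ciphertext != artwork

# Step 2: later, the layer connector reveals layer_key; the Dispatcher
# fetches the ciphertext and decrypts it client-side before rendering.
assert xor_cipher(layer_key, ciphertext) == artwork
```

The point is the decoupling: the expensive byte publication and the semantic reveal become two independent events, each timed on its own schedule.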
Putting this all together, each Quantum Cat is a small inscription that fetches a common inscription containing dispatch, decryption, and rendering code. That code fetches as many layer-connectors as are available on-chain (some won't be, because they are presigned but unbroadcast). It then uses the inscription IDs and decryption keys in those layer-connectors to fetch encrypted artwork from other inscriptions, decrypts it, and renders it to a canvas. When we need to broadcast these presigned inscriptions, we use bulk parallel CPFP transactions to bump them to the correct fee rate without having to commit up-front to too high a fee. The net result is that users have a Quantum Cat in their wallet that evolves new traits and attributes over time, while all of its assets remain immutable on Bitcoin.
There are other aspects of the project we haven't covered here: how the browser code manages intermittent failures when fetching all these assets, how to handle curation of an evolving collection, and how we managed the UTXO creation process for all the presigned assets in the first place (that one's easy: it's the same fan-out UTXO-splitting code described above for funding the CPFP UTXOs). But I hope you find the above discussion interesting and helpful, whether for an inscription project or another project involving presigned transactions.
This is a guest post by Rijndael. Opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc or Bitcoin Magazine.
IMF Demands Changes to El Salvador’s Bitcoin law: Report
The International Monetary Fund (IMF) is reportedly demanding changes to El Salvador's pro-Bitcoin law, hindering the country's attempts to secure a $1.4 billion credit line, as reported by Infobae.
NEW: 🇸🇻 IMF reportedly demands changes to El Salvador’s #Bitcoin law for a $1.4 billion aid.
They’re scared 🙌 pic.twitter.com/vmqhts5dCY
— Bitcoin Magazine (@BitcoinMagazine) April 12, 2024
El Salvador made history in 2021 by adopting Bitcoin as a legal tender under President Nayib Bukele. The country has since bet big on Bitcoin, building Bitcoin reserves, mining BTC, and launching educational initiatives.
However, the IMF has objected to the Bitcoin law amid loan negotiations with El Salvador, which needs financing to pay debts and obligations. Talks have stalled for nearly two years over the IMF’s calls to limit Bitcoin’s scope in the country.
According to IMF communications director Julie Kozack, speaking last Thursday, the "risks" of Bitcoin remain a key issue in discussions with El Salvador. The IMF has previously cited financial integrity and stability concerns about Bitcoin.
The fund's opposition highlights a clash over the future of money and payments. While Bukele sees Bitcoin as an innovative solution for financial inclusion, the IMF remains wary of its volatility and decentralized nature, and sees it as a threat to the fund's dominance.
Despite the pressure, El Salvador has stood firm on its Bitcoin commitment. The country has invested over $150 million in BTC reserves and continues to buy more. Bukele has also pledged to purchase 1 BTC daily.
The IMF loan impasse puts El Salvador in a difficult fiscal position. By tying loan access to changes in El Salvador’s Bitcoin law, the IMF is exerting its influence over poorer nations. But Bukele seems unwilling to back down on Bitcoin, creating an ideological tug-of-war.
El Salvador’s pioneering Bitcoin adoption signaled a shift toward decentralization and self-determination. Demands to roll back the Bitcoin law undermine the country’s monetary sovereignty.
The standoff illustrates the disruptive potential of Bitcoin to reshape global finance. While risky for El Salvador in the near-term, Bukele’s Bitcoin bet could pay long-run dividends.
US dollar modestly higher as sticky inflation persists, yen hits 34-year low
Post Content
A Deep Dive Into Bitcoin Miners’ Strategies During The Halving
The Bitcoin halving event is pivotal in the cryptocurrency world, impacting miners’ strategies and the entire network’s dynamics. As the rewards for mining new blocks are slashed in half, miners must adapt their approaches to maintain profitability and network stability. In this article, we delve into the various strategies employed by Bitcoin miners during the halving event.
What is Bitcoin halving?
The Bitcoin halving occurs approximately every four years, reducing the block rewards miners receive by 50%. This mechanism is programmed into the Bitcoin protocol to control the coin’s supply and maintain its scarcity over time. With each halving, the rate at which new bitcoins are generated slows down, influencing miners’ incentives and behaviors.
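The schedule is simple to compute: the subsidy starts at 50 BTC and halves every 210,000 blocks (roughly four years at 10-minute blocks). A sketch of the subsidy function, in satoshis:

```python
# Sketch: Bitcoin's block-subsidy schedule. The reward starts at 50 BTC
# and halves every 210,000 blocks until it reaches zero.
def block_subsidy_sats(height: int) -> int:
    halvings = height // 210_000
    if halvings >= 64:
        return 0  # right-shifting past 63 bits would always yield 0
    return (50 * 100_000_000) >> halvings

assert block_subsidy_sats(0) == 5_000_000_000        # genesis era: 50 BTC
assert block_subsidy_sats(840_000) == 312_500_000    # April 2024 halving: 3.125 BTC
```

The April 2024 halving (block 840,000) is the fourth, cutting the subsidy from 6.25 BTC to 3.125 BTC per block.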
Impact on the Bitcoin ecosystem
The strategies employed by Bitcoin miners during the halving event have significant implications for the broader cryptocurrency ecosystem. They influence network security, hash rate distribution, and the overall supply and demand dynamics of Bitcoin.
Strategies employed by Hiveon B2B mining clients
Ben Smith, CEO, Immersion BTC
1. What are your thoughts on BTC miners' strategies during the halving?
As a self-mining farm, the halving presents a balance between deploying new capital for new-generation units and optimizing older-generation units through third-party firmware. I have been thinking about this for a year now and have tried all the major firmware out there. I came to the conclusion that Hiveon is the best one and shows the best results. I found I can achieve close to new-generation ASIC efficiency by adding their firmware, which will keep me profitable post-halving.
2. What will be the difficulty of mining after halving?
The biggest difficulty post-halving will be the reduction in daily revenue. The Bitcoin price needs to rise to offset the cost of energy and other overhead. I'm confident the global hashrate will decrease in the short term post-halving, which should make the miners able to stay on more profitable; adding Hiveon helps ensure I can keep my units running profitably post-halving.
3. Do you foresee any opportunities for revenue diversification or alternative income streams to offset the impact of reduced block rewards?
I think bitcoiners are some of the most fruitful and out-of-the-box thinkers on Earth. We strive every day to become more efficient and this can mean in power terms or revenue terms. I’ve heard of guys reusing the heat to generate alternative sources of income. I have not taken that step yet but I do see Bitcoin mining being a secondary device for other industries in the future.
4. Forecasting changes in mining economics (costs, profitability, break even point)
Forecasting changes in mining is always difficult because it is such a young industry. The ever-changing impact of global events on the energy market creates a challenge for our industry, since we are so closely tied to it. My hope is that global conflict gets resolved and we can return to a more stable atmosphere, not only for bitcoin mining but for the future of our children and grandchildren.
5. Scenario planning for mining operations based on projected difficulty and Bitcoin price
When planning Bitcoin mining operations, the focus in the past was on how much hashrate a person could deploy. Now, when planning an operation, I think you should look at one thing: how efficient can I be? Instead of racing to increase your hashrate, focusing on efficiency prepares you for future changes in the industry.
6. Predicting market reactions to the upcoming halving
I think the market reaction to the halving is different than at any other time in Bitcoin history. When you turn on the traditional finance cable networks and you hear them discussing the halving you know we have made it. We have worked hard for mainstream adoption and it is now progressing with the ETFs.
Sascha Grumbach, Founder & CEO Green Mining DAO
1. What are your thoughts on BTC miners' strategies during the halving?
Bitcoin miners are strategically navigating the halving period by prioritizing the minimization of operational costs, leveraging tools like Hiveon to monitor and optimize their mining operations. In addition to focusing on efficiency, miners are diversifying revenue streams and carefully assessing the cost-to-efficiency ratio when acquiring new mining equipment. Recent surges in Bitcoin prices have prompted many miners to capitalize on selling some of their accumulated Bitcoin for financial gain. However, with anticipation of even higher prices in the future, some miners are adopting a "hodling" strategy, while others are cautious about overleveraging and are considering more frequent selling of Bitcoin to cover expenses, drawing from lessons learned during previous bull runs.
2. What will be the difficulty of mining after halving?
After the halving, the difficulty of mining Bitcoin is anticipated to fluctuate, initially experiencing a potential short-term drop in hash rate, possibly returning to levels below 70T. However, over the long term, experts project a continued steep increase in difficulty as the network adjusts to changes in miner activity and hash power. This adjustment mechanism ensures the stability and security of the Bitcoin network, but short-term variations are expected due to factors such as fluctuations in mining profitability and changes in the overall network hashrate.
3. Do you foresee any opportunities for revenue diversification or alternative income?
Absolutely, revenue diversification and alternative income streams are not just opportunities but necessities in light of the escalating competition within the mining industry, where pure mining alone may become less profitable over time. Embracing a circular model presents a significant opportunity, wherein byproducts of mining operations, such as excess heat, can be repurposed for innovative products like our Bitcoin Mango. For instance, we’re utilizing excess heat from Bitcoin miners to dry fruits, showcasing the potential for creative solutions that generate additional revenue streams. Furthermore, initiatives like grid build-out and net metering will become increasingly vital as global energy demands rise, offering avenues for monetization and sustainability in parallel with our core mining activities. This holistic approach not only bolsters profitability but also aligns with broader societal and environmental imperatives.
4. Forecasting changes in mining economics (costs, profitability, break even point)
Forecasting changes in mining economics indicates a prolonged period of challenges, as evidenced by the consistently low hash price index for over a year. Moreover, there’s a possibility of further decline in this index, indicating heightened pressure on profitability within the mining sector. Despite these challenges, electricity costs remain the primary expense for miners, underscoring the critical importance of optimizing energy efficiency to maintain competitiveness. Successfully addressing this aspect will be pivotal for miners seeking to navigate the evolving landscape and emerge as winners in the increasingly competitive environment.
5. Planning the expansion and modernization of mining capacities in light of the upcoming halving
In preparation for the upcoming halving and beyond, we are diligently planning the expansion and modernization of our mining capacities. Our strategy involves a steady upgrade of our mining fleet, ensuring that we stay at the forefront of technological advancements in the industry. Additionally, we are actively engaged in projects aimed at consistently increasing our hash rate, irrespective of the halving event. By maintaining a focus on continuous improvement and innovation, we are positioning ourselves to thrive in the evolving landscape of cryptocurrency mining, maximizing our efficiency and competitiveness for long-term success.
6. Predicting market reactions to the upcoming halving
Predicting market reactions to the upcoming halving is a complex endeavor, especially given the current volatility in the market. Unlike previous cycles, where increases before the halving were noticeable but not as drastic, the current environment is experiencing unprecedented shifts. Factors such as the FTX scam potentially hindering a true bull market last cycle and the recent launch of ETFs with significant inflows have introduced new variables that could fundamentally alter market dynamics. This disruption challenges conventional wisdom regarding Bitcoin cycles and its value development as a scarce asset with widespread exposure. With ETFs buying substantial amounts of Bitcoin daily while the production rate dwindles, the math suggests a scenario where sharp price increases become inevitable to maintain equilibrium. In essence, the landscape may be on the brink of significant transformation, defying previous expectations and reshaping the future trajectory of Bitcoin's market. This discrepancy between ETF demand (currently outpacing the roughly 900 BTC mined per day) and diminishing supply (450 BTC mined per day post-halving) underscores the necessity for market participants to closely monitor and adapt to these evolving dynamics to navigate potential opportunities and risks effectively.
Thoughts, and answers from Hiveon CEO James Jewell
Analysis of BTC miners' strategies during the halving
Efficiency is Key: Miners should ensure their operations are as efficient as possible. This includes using the most updated and energy-efficient hardware, optimizing energy usage, and considering the cost-to-efficiency ratio when investing in new equipment. Our Hiveon Enterprise OS allows operators to maximize all of the above by creating a cohesive site ecosystem.
Strategic Planning: Miners should plan for the long term. This includes anticipating future halving events, market conditions, and technological advances.
Diversifying Revenue Streams: Miners should consider diversifying their income sources to offset potential revenue losses from reduced block rewards. This could include offering mining services or participating in staking protocols.
Adapt and Evaluate: Miners should be prepared to continuously adapt and evaluate their strategies based on current market conditions and network difficulty. This includes being ready to adjust mining activities and timing strategies to optimize rewards.
Collaborative Mining: Joining mining pools to combine computational power can increase the chances of successfully mining blocks and earning a share of the rewards.
Leverage Mining Solutions: Miners should consider leveraging solutions like Hiveon which can help optimize mining processes, improve efficiency, and ultimately, maintain profitability post-halving.
What is the difficulty of mining after halving?
The difficulty of mining Bitcoin, or any other proof-of-work cryptocurrency, is a dynamic parameter that adjusts approximately every two weeks (2016 blocks) to maintain a constant block time, typically around 10 minutes per block. The difficulty adjusts based on the total computational power (hash rate) of the network.
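A sketch of that adjustment rule: each 2016-block retarget scales difficulty by the ratio of expected to actual elapsed time for the window, and consensus clamps the adjustment to at most a factor of four in either direction per period.

```python
# Sketch of the per-period difficulty retarget. The clamp mirrors
# Bitcoin's consensus rule limiting each adjustment to a factor of 4.
def retarget(old_difficulty: float, actual_seconds: int,
             expected_seconds: int = 2016 * 600) -> float:
    ratio = expected_seconds / actual_seconds
    ratio = max(0.25, min(4.0, ratio))  # the consensus clamp
    return old_difficulty * ratio

# If the last 2016 blocks averaged 9 minutes instead of 10,
# difficulty rises by about 11%:
new_difficulty = retarget(100.0, actual_seconds=2016 * 540)
```

So a post-halving hashrate exodus shows up as slower blocks for up to two weeks, after which difficulty falls and block times return to roughly ten minutes.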
After a halving event, such as the Bitcoin halving which occurs approximately every four years, the block reward for miners is reduced by half. This reduction in block rewards can potentially affect miner profitability, leading to changes in miner behavior and hash rate.
The difficulty adjustment mechanism ensures that blocks continue to be produced at a consistent rate despite fluctuations in hash rate. If a significant number of miners leave the network due to reduced profitability after a halving, the difficulty will adjust downwards to make mining easier and vice versa.
Therefore, the difficulty of mining after a halving event is determined by the interplay of miner participation, hash rate changes, and the dynamic adjustment mechanism built into the protocol. It’s worth noting that while halvings can impact miner profitability and hash rate, the difficulty adjustment mechanism is designed to maintain network security and stability over the long term.
Do you foresee any opportunities for revenue diversification or alternative income streams to offset the impact of reduced block rewards?
Bitcoin miners can explore various strategies to diversify their revenue streams and mitigate the impact of reduced block rewards from halving events. Some potential opportunities for revenue diversification include:
Mining Altcoins: While Bitcoin mining may become less profitable after a halving due to reduced block rewards, miners can shift their computational power to mine alternative cryptocurrencies (altcoins) that may offer more favorable mining economics. Altcoins with lower difficulty levels or emerging consensus mechanisms may present opportunities for miners to generate additional revenue.
Mining Pools and Services: Mining pools can offer additional services beyond traditional block mining, such as transaction processing, blockchain analytics, and consultancy services. By diversifying their offerings, mining pools can generate additional revenue streams and attract a broader range of clients, including institutional investors and blockchain projects.
Hardware Sales and Leasing: Mining hardware manufacturers can generate revenue by selling or leasing mining equipment to other miners or cryptocurrency enthusiasts. Additionally, they can offer maintenance services, hosting solutions, and consultancy services to support miners in optimizing their mining operations.
Staking and Masternodes: Some cryptocurrencies utilize proof-of-stake (PoS) or masternode consensus mechanisms instead of proof-of-work (PoW) mining. Miners can diversify their revenue streams by participating in staking or operating masternodes for these cryptocurrencies, earning rewards in the form of staking rewards or transaction fees.
Blockchain Development and Consulting: Miners with expertise in blockchain technology and cryptocurrency mining can offer development and consulting services to blockchain projects, enterprises, and governments. These services may include smart contract development, protocol upgrades, security audits, and regulatory compliance consulting.
Cryptocurrency Trading and Investments: Miners can allocate a portion of their mining proceeds to cryptocurrency trading and investments, taking advantage of market opportunities to generate additional returns. However, this approach carries inherent risks and requires careful risk management and investment strategies.
Energy Trading and Grid Services: Miners with access to surplus energy resources, such as renewable energy sources or excess capacity from energy-intensive industries, can explore opportunities to monetize their energy assets through energy trading, grid services, and demand response programs.
Overall, miners can diversify their revenue streams by leveraging their expertise, infrastructure, and network resources to tap into various opportunities within the broader cryptocurrency ecosystem and energy markets. Diversification can help miners navigate market fluctuations, regulatory challenges, and technological advancements while maximizing their revenue potential.
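The economics of switching chains come down to a simple relationship: a miner's expected revenue is their share of the network's hash power times the chain's daily issuance, converted to dollars. A minimal sketch of that comparison, using entirely hypothetical network figures (not live data):

```python
def daily_revenue_usd(miner_hashrate, network_hashrate, blocks_per_day,
                      block_reward, coin_price_usd):
    """Expected daily revenue: the miner's share of network hash power
    times the coins issued per day, converted to USD."""
    share = miner_hashrate / network_hashrate
    return share * blocks_per_day * block_reward * coin_price_usd

# Hypothetical figures for illustration only (not live network data).
btc = daily_revenue_usd(
    miner_hashrate=1e15,      # 1 PH/s of SHA-256 hardware
    network_hashrate=600e18,  # assumed 600 EH/s network
    blocks_per_day=144,
    block_reward=3.125,       # post-2024-halving subsidy
    coin_price_usd=62_000,
)
alt = daily_revenue_usd(
    miner_hashrate=1e15,
    network_hashrate=2e18,    # much smaller hypothetical network
    blocks_per_day=144,
    block_reward=50,
    coin_price_usd=80,
)
print(f"BTC: ${btc:.2f}/day vs. altcoin: ${alt:.2f}/day")
```

In practice the comparison is only valid between coins the miner's hardware can actually mine (e.g. SHA-256 chains for Bitcoin ASICs), and smaller networks carry higher price and liquidity risk.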
Forecasting changes in mining economics (costs, profitability, break-even point)
Scenario planning for mining operations based on projected difficulty and Bitcoin price
Case study: adaptation of major mining farms to previous halvings
Planning the expansion and modernization of mining capacity ahead of the upcoming halving
Predicting market reactions to the upcoming halving
Predicting market reactions to events like the halving is difficult because many interacting factors influence market behavior. However, based on historical trends and general market dynamics, we can make some educated guesses about potential reactions:
Price Volatility: Historically, Bitcoin has experienced significant price volatility around halving events. Anticipation leading up to the halving often drives up demand, which can push the price higher. Once the event occurs and the expected supply reduction takes effect, there may be a period of consolidation or even a temporary decline as market participants reassess the new supply-demand dynamics.

Speculative Activity: Halving events tend to attract significant speculative activity as traders and investors try to capitalize on price movements. This can exacerbate volatility, leading to rapid swings in both directions.

Market Sentiment: Sentiment plays a crucial role in cryptocurrency markets. Positive sentiment ahead of the halving, driven by expectations of reduced inflation and increased scarcity, can contribute to upward price momentum. Conversely, negative sentiment or concerns about the event's impact on mining profitability could lead to price declines.

Miner Behavior: The halving directly affects Bitcoin miners by cutting their block rewards in half. Miners may respond by adjusting their operations, potentially changing hash rate and network security. Any significant shifts in miner behavior could in turn affect market sentiment and price dynamics.

Macroeconomic Factors: External factors such as global economic conditions, regulatory developments, and geopolitical events also influence cryptocurrency markets. While the halving itself is a supply-side event, broader market trends and macroeconomic conditions shape investor sentiment and price movements.

Long-Term Outlook: Despite short-term volatility, many Bitcoin proponents view the halving as a positive development for the cryptocurrency's long-term value proposition. The reduction in supply inflation reinforces Bitcoin's scarcity and its potential as a hedge against inflation, which could attract long-term investors and institutional interest.
While it's challenging to predict the precise market reaction to the upcoming halving, understanding historical trends, market dynamics, and the broader ecosystem can provide insight into potential outcomes. Investors should nonetheless approach cryptocurrency markets with caution, given the high volatility and inherent risks involved.
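The supply-side mechanics behind these expectations, at least, are fully deterministic: the block subsidy halves every 210,000 blocks. A sketch of the schedule, mirroring the integer right-shift Bitcoin Core uses:

```python
HALVING_INTERVAL = 210_000                # blocks between halvings
INITIAL_SUBSIDY_SATS = 50 * 100_000_000  # 50 BTC in satoshis

def block_subsidy(height):
    """Block subsidy in satoshis at a given block height. The subsidy
    halves every 210,000 blocks via an integer right-shift, so it
    reaches zero after 64 halvings."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:  # a shift this large would be undefined in C++
        return 0
    return INITIAL_SUBSIDY_SATS >> halvings

for h in (0, 210_000, 630_000, 840_000):
    print(h, block_subsidy(h) / 100_000_000, "BTC")
# Block 840,000 (April 2024) cut the subsidy from 6.25 to 3.125 BTC.
```

Total issuance under this schedule converges to just under 21 million BTC, which is the scarcity property the "reduced supply inflation" arguments above rest on.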
Given the upcoming (or recent) halving event, could you provide insights into how top BTC miners have adjusted their strategies to maintain profitability amidst the reduction in block rewards and increased competition?
Efficiency Improvements: One common strategy for miners facing reduced block rewards is to improve the efficiency of their operations. This can involve upgrading to more energy-efficient mining hardware, optimizing mining software, and implementing better cooling to reduce operational costs.

Strategic Location and Energy Sourcing: Miners may locate their operations in regions with access to cheap electricity, renewable energy, or government subsidies. By securing favorable energy contracts or partnering with energy providers, miners can mitigate the impact of reduced block rewards on their profitability.

Vertical Integration: Some miners vertically integrate by investing in mining hardware manufacturing, hosting facilities, or energy production infrastructure. Controlling more of the mining value chain lets them optimize costs, increase operational efficiency, and capture additional revenue streams.

Diversification of Revenue Streams: As discussed earlier, miners can diversify by mining altcoins, offering mining-related services, participating in staking or masternode networks, or engaging in cryptocurrency trading and investments.

Hedging Strategies: Miners may use financial instruments such as futures, options, or other derivatives to hedge against price volatility and revenue fluctuations. By locking in future revenue or limiting downside risk, miners can protect their profitability in uncertain market conditions.

Community Engagement and Governance: Engaging with the Bitcoin community and participating in governance processes helps miners stay informed about protocol developments, network upgrades, and potential changes to mining incentives. By actively contributing to the ecosystem and aligning their interests with those of the broader community, miners can support their long-term profitability and sustainability.
Overall, successful miners adopt a combination of these strategies to adapt to changes in the mining landscape, maintain profitability, and position themselves for long-term success in the evolving cryptocurrency market.
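The efficiency point can be made concrete: a rig's break-even electricity price depends only on its revenue per TH/s per day ("hashprice") and its efficiency in J/TH, since a rig producing 1 TH/s at E J/TH draws E watts continuously. A minimal sketch using hypothetical hashprice figures (not market data):

```python
def breakeven_electricity_price(usd_per_th_day, efficiency_j_per_th):
    """Break-even electricity price in $/kWh for a rig earning
    `usd_per_th_day` per TH/s at the given efficiency (J/TH).
    1 TH/s at E J/TH draws E watts, i.e. E * 24 / 1000 kWh per day."""
    kwh_per_th_day = efficiency_j_per_th * 24 / 1000
    return usd_per_th_day / kwh_per_th_day

# Hypothetical hashprice figures for illustration only.
pre  = breakeven_electricity_price(0.10, 25)  # before the halving
post = breakeven_electricity_price(0.05, 25)  # hashprice halved
old  = breakeven_electricity_price(0.05, 35)  # less efficient rig
print(f"pre: {pre:.3f}, post: {post:.3f}, old rig: {old:.3f} $/kWh")
```

When hashprice halves, the break-even power price halves with it, which is why post-halving survival favors efficient hardware and cheap energy: an older 35 J/TH machine breaks even at a noticeably lower power price than a 25 J/TH one earning the same hashprice.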
This is a guest post by Keaton Reckard. Opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc or Bitcoin Magazine.