EIP-1559: Fee market change for ETH 1.0 chain

I think this statement mis-frames the inclusion criteria. Here’s a mirror image:

I still fail to see how "just basefee plus some small tip" would translate to lower network-wide fee variance (when comparing periods of low and high demand for gas). More importantly, I fail to see how the average basefee+tip would ever be lower than the current average fee. As @fubuloubu wrote:

(I’ve heard it at the rumour mill that “EIP-1559 === cheaper transactions”; not in this thread yet.)


The trade-off for the miner is currently between block_reward+fees and ommer_block_reward.

In the proposed scheme, it would be between block_reward+tips and ommer_block_reward.

Miners include transactions in blocks based on the marginal profit increase they would get, offset by the chance that the block would become an ommer. Currently, fees rarely make up 5% of the base block reward. (This was <3% before the most recent issuance reduction; sorry I can’t link a chart: the numbers are from anecdotal personal observation.)

Let’s generously assume that the average fees per block are, in fact, 5% of the base block reward (that is, they are 0.1 ETH and 2.0 ETH, respectively). So, miners have to weigh this 5% extra gain at probability P against a 12.5% loss (at best! 25% for 2-block-old ommer) at probability 1-P. The profitability condition is:

0.05 * P >  0.125 * (1 - P)
       P >~ 0.714
0.05 * P >  0.25 * (1 - P)
       P >~ 0.833
...
   F * P > U * (1 - P)  # F: extra profit percentage when getting Fees
       P > U / (F + U)  # U: loss percentage due to becoming Uncle
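The break-even inclusion probability can be computed directly from that inequality (an illustrative sketch; `break_even_p` is my own helper name):

```python
# Break-even inclusion probability P for a miner weighing extra fee profit F
# (as a fraction of the block reward) against an uncling loss U.
# Numbers below match the examples in the post.

def break_even_p(extra_fee_fraction: float, uncle_loss_fraction: float) -> float:
    """Smallest P at which F * P > U * (1 - P) holds, i.e. P > U / (F + U)."""
    f, u = extra_fee_fraction, uncle_loss_fraction
    return u / (f + u)

# Fees at 5% of the block reward vs a 12.5% loss (1-block-old ommer):
print(round(break_even_p(0.05, 0.125), 3))  # ~0.714
# ...vs a 25% loss (2-block-old ommer):
print(round(break_even_p(0.05, 0.25), 3))   # ~0.833
```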

Back to what I started with: I fail to see how lowering user-borne fees (and by necessity lowering F, since F is the tips part of basefees+tips, and ostensibly basefees+tips < fees…) will result in anything but miners becoming reluctant to include transactions.

It’s reasonable to expect the opposite: average basefees+tips under this proposal will be higher than average current fees.

(There! It’s out of the way now.)


The scheme further breaks down in times of “congestion”, by which I mean periods of increased average uncle rate: an ever-larger proportion of hashing power contributing to the “security of the chain”, but not to (i.e. at the expense of) transaction throughput.

During congestion, when relatively many ommer solutions are available, miners:

  • routinely have the option to include ommer references in their blocks - because, hey, they keep popping up!; and
  • are increasingly pressed to drop transactions altogether - because “no compute” is much faster to process than the “much compute” blocks from the competition (ones that would do well to get out of the way and become ommers…).

Transactions are now in competition with uncles; in particular, the highly-priced transactions (that tend to catch the eye and prompt “congestion!.. congestion!..”). If they can’t “outbid” the inclusion of an uncle (adjusted for the risk of the block itself becoming an ommer!), then it is not rational for miners to include transactions instead of ommers.

This is not specific to the proposed scheme, but is also the state of affairs currently (and manifested mildly in late 2017 and early 2018, I believe).

As noted in the github issue 1559 summary:

If a transaction sender highly values urgency during conditions of congestion, they are free to instead set a much higher premium and effectively bid in the traditional first-price-auction style.

In other words, during congestion, the proposed scheme devolves (at best!) to the current auction mechanism. All good here, except the “at best” part: see previous section.


Somewhat OT: I am against lowering user-borne transaction fees wholesale until a “state fees” portion is introduced to them.

Any scheme that claims to lower user-borne fees needs to consider that state expansion gets that much faster when user-borne fees are that much lower. The costs are now borne by someone other than the users, but they have not gone away.

That said, I do appreciate burning a portion of ether as a means to balance (counteract) newly-minted ether.


Finally: having both a gasprice_tip (per-gas-unit, in shannons) and cap (per-whole-tx, in ether), where

  • gasprice_tip * gas_limit <= cap;
  • basefee <= gasprice_tip * gas_limit;
  • basefee <= cap.

… will be a nightmare for UX, and will most likely devolve into basefee <= cap as far as wallet interfaces are concerned, with gasprice_tip abstracted away as a percentage of cap.


If the transactions included end up optimizing for the “tip” added to a transaction, then how have we not re-created the existing fee market, but worse because of this burning requirement?

Currently, a miner has two reasons NOT to include a transaction:

  1. Opportunity cost (ie. other transactions)
  2. Increased uncle rate risk

(2) is predictable, (1) is unpredictable. This proposal takes (1) out of the equation, so we should expect miners to be satisfied by a constant level of tip even as the basefee fluctuates.

will result in anything but the miners becoming reluctant to include transactions.

We can actually calculate the size of this marginal uncle rate risk (i.e. the increase in the chance that your block becomes an uncle from including an additional 1 million gas in a block).

The ideal approach would be to run a linear regression of gas usage vs uncle rate data here to see what the marginal risk of being uncled is if you add an additional 1 million gas. Unfortunately, this is temporarily complicated because we recently upgraded network block propagation, and there’s still relatively little data post-change. So what we can do instead is use the formula U = (k1 + k2 * G) / T, where U is uncle rate, T is block time and G is gas usage of a block (see Decker and Wattenhofer 2013 for some of the arguments that lead to this). We have a natural experiment in Constantinople bringing T down, so we can use block, uncle and gas usage figures from before and after the fork. Before, we had 220 uncles and 4200 blocks per day, so U_pre = 0.0523; after, we have 500 uncles and 6400 blocks per day, so U_post = 0.078. Block time = 86400 / blocks per day, so we get T_pre = 20.57 and T_post = 13.5. G_pre = 7.5m, G_post = 6.2m (see here). So we get:

0.078 = (k1 + k2 * 6.2m) / 13.5
0.0523 = (k1 + k2 * 7.5m) / 20.57

Solving the linear system gives:

k1 = 0.944
k2 = 0.0175 * 10**-6

To get an incremental uncle rate per million gas, divide k2 (expressed per million gas, i.e. 0.0175) by the block time (13.5), giving 0.0013 per million gas. Another way to see that k2 is tiny: a 2/3x reduction in block time leads to a 3/2x increase in uncle rate, showing that the “base” (i.e. not usage-dependent) uncle rate by itself explains almost all of the uncle rate.
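The 2×2 solve above can be reproduced in a few lines (a sketch; `fit_uncle_model` is my own helper name, and the inputs are the pre/post-Constantinople figures quoted in the text):

```python
# Solving the two-observation system U = (k1 + k2 * G) / T, rewritten as
# k1 + k2 * G = U * T, for the two constants k1 and k2.

def fit_uncle_model(obs_a, obs_b):
    """Each observation is (uncle_rate, gas_per_block, block_time)."""
    (u1, g1, t1), (u2, g2, t2) = obs_a, obs_b
    k2 = (u1 * t1 - u2 * t2) / (g1 - g2)
    k1 = u1 * t1 - k2 * g1
    return k1, k2

pre = (220 / 4200, 7.5e6, 86400 / 4200)    # U_pre, G_pre, T_pre
post = (500 / 6400, 6.2e6, 86400 / 6400)   # U_post, G_post, T_post
k1, k2 = fit_uncle_model(post, pre)
print(k1, k2)  # close to the post's k1 ~= 0.944 and k2 ~= 0.0175e-6
```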

If you enter x=AVG_BLOCK_UTIL and y=UNCLE_RATE in etherchain’s correlation utility and set the date to Jan 16 (first Constantinople attempt, which is what forced everyone to update), you can see the line doesn’t slope up quickly at all.

Note that as an absolute upper limit, marginal uncle rate risk is 0.01 per million gas, because that’s what it would be if all of the current uncle rate were explainable by the k2 * G term. So we have two estimates for marginal uncle rate risk: 0.0013 per million (aggressive) and 0.01 per million (maximally conservative).

We can compute the expected loss to a miner of including a transaction via U' * (R_B - R_U) (marginal uncle rate risk times the difference between block reward and uncle reward). The average uncle reward is ~1.67 ETH (note that this means that most uncles are getting the full 7/8 reward; a year ago this was not true, which is part of why a year ago many miners were not including transactions). So R_B = 2, R_U = 1.67 and U' = 0.0013 per million gas, so the expected loss per million gas is 0.000433 ETH (or 0.433 gwei per gas). Under conservative estimates this goes up to 3.33 gwei per gas.
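The final expected-loss arithmetic, as a sketch (the function name is mine):

```python
# Expected loss from including 1M extra gas: marginal uncle rate risk
# times the reward gap between a full block (2 ETH) and an average uncle
# (~1.67 ETH), converted from ETH-per-million-gas to gwei-per-gas.

def inclusion_loss_gwei_per_gas(marginal_uncle_risk_per_mgas,
                                r_block=2.0, r_uncle=1.67):
    loss_eth_per_mgas = marginal_uncle_risk_per_mgas * (r_block - r_uncle)
    return loss_eth_per_mgas * 1e9 / 1e6  # 1 ETH = 1e9 gwei; per 1e6 gas

print(inclusion_loss_gwei_per_gas(0.0013))  # aggressive estimate, ~0.43 gwei/gas
print(inclusion_loss_gwei_per_gas(0.01))    # conservative bound, ~3.3 gwei/gas
```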

TLDR: the disincentive that miners have against including txs is currently already quite low, somewhere between 0.4 and 3.3 gwei, which is only a small portion of transaction fees seen today.


This actually depends on how long you’re willing to wait. If you absolutely need a tx included in the next block, then yes, in sudden demand spikes it can temporarily degrade to a first-price auction, but if you’re willing to wait ~10 blocks, then it works like an ascending-price auction (i.e. it’s efficient).


Quick note: ideally, we’d run that with gas usage from the uncle blocks, not the “canonical” blocks that included the uncles. Etherchain’s correlations page provides the latter.

This works with the assumption that blocks become uncles not because of the particular transaction inclusion strategy used to construct them, but because of other factors, which can be summed up as “random luck” (the worst there is :smile:).

It’s probably hard to prove one way or another with real historic ommers (which are not guaranteed to be available), so IMO this is an OK assumption to roll with (until proven otherwise, e.g. by intentionally seeking out missing ommers over devp2p ASAP, and comparing them to the nephews as classes).


That is 0.4 to 3.3 gwei per gas, and it is not insignificant. (In fact, this is the gas price range I most often use…)

Gitcoin’s gas history breakdown shows “gas price needed to confirm within 3 hours” to fall most of the time within that range, and “gas price needed to confirm within 1 minute” skipping off the 3.5 mark (or so).

Even using Etherscan’s average gas price chart (which shows yesterday’s as 13.5 shannon), the difference is only around one order of magnitude. This, I feel, is not a good chart/metric to use; what is needed here is a breakdown by gas price ranges over time, something like Johoe’s Bitcoin Mempool charts, but for Ethereum main-line blocks. Then we could see how much of the gas used falls into the 0.4-3.3 gas price range (or, more prudently, recalculate this gas price range for each time period in question), where miners are taking an all-but-guaranteed loss, “out of the kindness of their hearts”.


P.S. I’ve gotten OT, I think. Sorry for that.

Oh I agree it’s not insignificant! But what you need to keep in mind is that, in an economic sense, the BASEFEE and the TIP are NOT additive. To see why, imagine that the following two facts are true:

  • The fee level at which there is 8 million gas of demand per block is, in some particular span of time, 5 gwei
  • Miners value the cost of including 1 gas worth of transactions at 1 gwei

Clearly, no miner will accept a transaction with a tip of less than 1 gwei, and let’s say there’s a bit of monopoly power added on top so miners on average only accept transactions with tips of 1.5 gwei. Suppose the BASEFEE starts off at 1 gwei. Users would have to pay only 2.5 gwei total, so demand is higher than 8 million; hence, blocks are more than 50% full, and the BASEFEE rises. Eventually, when the BASEFEE reaches 3.5 gwei (NOT 5 gwei!), users have to pay 5 gwei total, and at that point demand is equal to 8 million, so the BASEFEE reaches an equilibrium.
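The dynamic can be sketched in a few lines (a toy model: the linear demand curve and the adjustment rule here are illustrative assumptions I am adding, not part of the proposal):

```python
# Toy simulation of the equilibrium argument above: miners demand a 1.5 gwei
# tip, and total demand hits 8M gas/block when the all-in price is 5 gwei.

TARGET = 8_000_000   # 50% of the limit
LIMIT = 16_000_000
TIP = 1.5            # gwei that miners insist on

def demand(total_price_gwei):
    # Hypothetical linear demand: 8M gas at 5 gwei, zero at 10 gwei.
    return max(0.0, min(LIMIT, LIMIT * (1 - total_price_gwei / 10)))

basefee = 1.0
for _ in range(500):
    gas_used = demand(basefee + TIP)
    basefee *= 1 + (gas_used - TARGET) / TARGET / 8  # 1/8 max adjustment

print(round(basefee, 2))  # settles near 3.5 gwei (3.5 + 1.5 tip = 5), not 5
```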

The one special case is the case where the equilibrium fee is less than the tip that miners demand. In this case, the BASEFEE would fall to zero and blocks would just be less than 50% full until demand rises again. But notice that in this situation, the current system would have the same consequence.


Hey Vitalik, will you or @econoar be able to present this at the Berlin Istanbul / 1.x meeting on the 17th and 18th? Trying to confirm whether either of you will be presenting remotely.

I will unfortunately have to present remotely; it’s too close to EthCapetown for me to fly in personally. But I will be very happy to be around on a call for as long as needed.


Great, remote is fine. Is this the best place to work out logistics, or should I use email / Telegram / whatever works?

We will need to know your time zone to schedule you at a reasonable time.


I love this proposal! I am a non-developer who has performed several thousand transactions in the last few months, and paid gas prices ranging from 1 to 40 Gwei via metamask. The current system actually works fairly well for me, since I have become an expert in picking the correct gas price, but I would like to not spend so much time gaming the gas prices.

Rather than adjusting the MINFEE by 1/8 in each block, is there a reason we can’t adjust it once a day to remove fluctuation even more? I would like to see a target of 50 Billion gas per day rather than 8 million per block, and triple the max size of each block to 24 million gas (uncles allowing). That way if a big ICO or surge of spam hits, we could see 100 Billion in gas for 24 hours, and then MINFEE increases once every 6200 blocks (like Bitcoin-style difficulty adjustments) until the daily gas usage falls below the target…at which point it slowly starts coming back down. This would have the nice effect of doubling current transaction capacity during busy periods while keeping the state size growth where it is now.

If anyone has a super urgent transaction, they could pay 2x MINFEE to (nearly) guarantee it in the next block. Metamask could then simplify the choice of fee to Normal and Fast (2x).


Exciting research overall, and the big picture is crystal clear, but from what I understand the choice of the actual update rule for dynamic pricing is not justified very well. For this specific proposed formula, I have an attack in which a small set of users with only 10% of transaction volume can halve the price in less than 1000 blocks, merely by timing the broadcast of their transactions in a particular way.

Replied on the Zcash thread, copying my replies here:

I think I see the flaw in your attack strategy. Your attack basically says that the attacker (with 10% of all transactions) concentrates their transactions in every tenth block, so nine blocks out of ten the attacker’s share is empty and the block size is 90% of the target, and the tenth time the attacker sends all their transactions so the block size is 190% of the target. This means that the minfee adjusts down by 1.25% nine blocks out of ten and adjusts up by 11.25% the tenth time, but because of (x-a)*(x+a) < x**2 inequalities, this leads to a slight decrease on average (0.9875 ** 9 * 1.1125 ~= 0.9934 per 10-block period). Applying this 100 times, we get the observed 50% decrease. So it is true that it’s possible to send the same transaction volume with slightly different consequences to demand, but this ignores the fact that demand adjusts with fees, and once the fee drops a few percent others will come in and the equilibrium is re-established. All that this means is that in equilibrium average usage will be about 0.5% higher than the target; not a big deal.
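The per-period arithmetic checks out (a one-liner under the 1/8 adjustment-rate assumption):

```python
# Nine blocks at 90% of target (excess -0.1), then one at 190% (excess +0.9),
# with the 1/8 maximum per-block adjustment rate.

per_period = (1 - 0.1 / 8) ** 9 * (1 + 0.9 / 8)
print(round(per_period, 4))         # ~0.9934 per 10-block period
print(round(per_period ** 100, 2))  # ~0.52 after 100 periods (1000 blocks)
```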

That said, if we want to eliminate this issue entirely, we can do that by replacing minfee *= (1 + excess * adjSpeed) with minfee *= e ** (excess * adjSpeed). Doing exponentiation in consensus is hard, but we can apply a quadratic Taylor approximation, e ** x ~= 1 + x + x**2 / 2, and do minfee *= (1 + excess * adjSpeed + (excess * adjSpeed) ** 2 / 2); then the discrepancy drops from 1 - 0.9934 to 1 - 0.99978. I’m not sure if it’s worth the extra complexity; I think either way works fine.
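A quick comparison of the two update rules under the same attack pattern (a sketch; the function names are mine):

```python
# Linear update vs its quadratic-Taylor variant under the same
# nine-at-90% / one-at-190% block pattern.

ADJ = 1 / 8

def linear(excess):
    return 1 + excess * ADJ

def quadratic(excess):  # e**x ~= 1 + x + x**2/2 with x = excess * ADJ
    x = excess * ADJ
    return 1 + x + x ** 2 / 2

for rule in (linear, quadratic):
    factor = rule(-0.1) ** 9 * rule(0.9)
    print(rule.__name__, factor)
# linear drifts to ~0.9934 per period; quadratic stays near ~0.99978
```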

Users are paying for the burden which the previous users put on the network rather than their own

This is intentional for incentive compatibility. It’s a common technique in game theory for the price that a user pays to be explicitly only based on other users’ activity, so that incentive-compatibility for the user can be proven trivially as a simple matter of “if the price is favorable the user gets it, otherwise the user does not”. Otherwise, a user’s “real price” is higher than the “set price” because a user has to take into account the fact that their own transaction pushes up the real price for themselves.


Replied on the Zcash thread, copying my replies here:

Otherwise, a user’s “real price” is higher than the “set price” because a user has to take into account the fact that their own transaction pushes up the real price for themselves.

Yes, I agree with your observation. However, I think this can actually be a good point and should be exploited, because it incentivizes a user to avoid executing a lot of transactions in a single block. This is similar to the problem of liquidating a large volume of some asset in game theory: whenever you sell some amount the price goes down, and as a result you are incentivized to sell little by little. I think the same mechanism design helps to avoid network congestion.

To be more specific, I have already mentioned the paper “Optimal Execution of Portfolio Transactions”, in which the authors prove that with a very similar additive update formula, the optimal strategy is to execute transactions over a long period of time. Maybe we can use the same formula here.

That said, if we want to eliminate this issue entirely, we can do that by replacing minfee *= (1 + excess * adjSpeed) with minfee *= e ** (excess * adjSpeed).

If I rewrite your formula as log(minfee) += excess * adjSpeed, your alternative suggestion is just to have the same additive update rule and then exponentiate it to get minfee. As I said in the previous comment, the additive framework is well-known in economics, but why the exponential? In other words, suppose that minfee += excess * adjSpeed, and then we charge everyone a fee of f(minfee). f could be the identity function, an affine mapping, or any other positive increasing function, including the exponential function; I am just not sure why the exponential seems like the canonical choice to you.

This is intentional for incentive compatibility. It’s a common technique in game theory for the price that a user pays to be explicitly only based on other users’ activity, so that incentive-compatibility for the user can be proven trivially as a simple matter of “if the price is favorable the user gets it, otherwise the user does not”.

Even in the new settings corrected by the exponential or additive formula, I still believe that users should be charged for their own transactions as well as the aggregated previous usage. Today, I was reading about the gas token: any such smart contract exploiting the storage refund in Ethereum can be used for hoarding, which is immediately profitable if the gas price you pay is independent of the gas cost of your smart contract.

What is to stop miners constantly reducing MINFEE to 0, at which point we’re back to a pure auction-based system? The transaction pool cannot be considered a valid measure of network capacity (given that it’s variable across nodes) so it would have to come down to block capacity. Why wouldn’t miners mine empty blocks until MINFEE is 0, then only fill blocks half-full to keep it there?


I have tried to run a simple model to see what MIN_FEE would look like for the existing history of blocks and transactions, assuming that TARGET_GAS_USED is equal to half of the block’s gas limit. The first thing I noticed is that the formula for MIN_FEE gets stuck at 0 if it ever gets there. Any suggestions?


Vitalik suggested introducing a minimum in the MIN_FEE formula, like 1000 wei. It worked until around block 4.13m, after which MIN_FEE started to run away towards infinity, and by block 4.7m the numbers were one console page long :slight_smile:

EDIT: I know that clamping probably needs to be applied, but how exactly to do it? Also, I agree with other comments that the determination of MIN_FEE needs to be driven by the protocol itself and not “voted” up and down by the miners.


Here is Vitalik’s presentation on the matter: https://youtu.be/HaT-BIzWSew?t=4708
And here is my proposed amendment (I am planning to put it in text quite soon), just after Vitalik’s presentation: https://youtu.be/HaT-BIzWSew?t=6601


Because with an exponential function the rate of change in percentage terms does not depend on the current fee level. This seems like a natural property to have, especially since it means that the fee can adjust to unexpectedly huge spikes in logarithmic time rather than linear time. If we don’t care about this, then a linearly adjusted min fee (eg. max 0.25 gwei adjustment per block) could work too.
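A toy illustration of the difference (the 0.25 gwei linear cap is the example figure from the post; the 1-to-100 gwei spike is my own illustrative assumption):

```python
# Blocks needed to move the min fee from 1 gwei to 100 gwei at the maximum
# per-block change: multiplicative (1/8 rate) vs a linear 0.25 gwei cap.

import math

mult_blocks = math.ceil(math.log(100 / 1) / math.log(1 + 1 / 8))
linear_blocks = math.ceil((100 - 1) / 0.25)
print(mult_blocks, linear_blocks)  # 40 vs 396: logarithmic vs linear time
```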

Vitalik suggested introducing a minimum in the MIN_FEE formula, like 1000 wei. It worked until around block 4.13m, after which MIN_FEE started to run away towards infinity, and by block 4.7m the numbers were one console page long :slight_smile:

So this is why you can’t test the min fee formula without actually using it on a live blockchain: the feedback mechanism critically depends on the fact that if the fee shoots up, demand decreases.

However, I think this can actually be a good point and should be exploited because it incentivizes a user to avoid executing a lot of transactions in a single block.

If the total volume in one block ends up exceeding the gas limit, then it already becomes more expensive for you, because you have to outbid other people’s fee premiums. If it does not, then as I mentioned in my presentation, there’s basically negligible social harm from having one block with 12 million gas followed by a block with 4 million gas compared to two blocks with 8 million gas, so what’s the problem?


I think the problem is that the people in the next block are paying much more because of your transactions, while outbidding the others in the current block should not be that costly.

@mcdee, an excellent observation. The situation is worse than what you described, as miners do not even have to fill blocks half-full to keep it there. They can fill them up, and once the price is zero, it always remains zero. I think that there should be no incentive for miners to roll back to the auction-based system. My proposal is a sliding-median price auction for the premiums, to avoid overpayments, instead of the first-price auction. You can read more details here.

Continuing my previous discussion about the choice of the update function f: the additive formula does not have this problem, as the price can always recover from zero.

My proposal: set a minimum value of at least 1 wei for MINFEE, PARENT_MINFEE, and block.gas_used.

Using these assumptions:

delta = block.gas_used - TARGET_GASUSED
TARGET_GASUSED = 8,000,000
MINFEE_MAX_CHANGE_DENOMINATOR = 8

There is also a way to simplify the MINFEE formula:

MINFEE = PARENT_MINFEE + PARENT_MINFEE * delta // TARGET_GASUSED // MINFEE_MAX_CHANGE_DENOMINATOR

For the rest of this comment, I will be using abbreviated variable names:

MF = MINFEE
MF_P = PARENT_MINFEE
MF_MCD = MINFEE_MAX_CHANGE_DENOMINATOR
GU_B = block.gas_used
GU_T = TARGET_GASUSED
delta = GU_B - GU_T

Note: I’m also using / for division instead of //

  1. Substitute the original formula with the new variable names:
MF = MF_P + MF_P * delta / GU_T / MF_MCD
  2. Factor out MF_P:
MF = MF_P * (1 + delta / GU_T / MF_MCD)
  3. Instead of dividing by MF_MCD, multiply by the reciprocal:
MF = MF_P * (1 + (delta / GU_T) * (1 / MF_MCD))
  4. Substitute in the variables for delta:
MF = MF_P * (1 + ((GU_B - GU_T) / GU_T) * (1 / MF_MCD))
  5. Split the delta numerator and divide both terms by the same denominator:
MF = MF_P * (1 + ((GU_B / GU_T) - (GU_T / GU_T)) * (1 / MF_MCD))
  6. GU_T / GU_T == 1, so replace GU_T / GU_T with 1:
MF = MF_P * (1 + ((GU_B / GU_T) - 1) * (1 / MF_MCD))
  7. Distribute 1 / MF_MCD over both GU_B / GU_T and -1:
MF = MF_P * (1 + (GU_B / (GU_T * MF_MCD)) - (1 / MF_MCD))
  8. Replace the first 1 with MF_MCD / MF_MCD:
MF = MF_P * ((MF_MCD / MF_MCD) + (GU_B / (GU_T * MF_MCD)) - (1 / MF_MCD))
  9. Since MF_MCD / MF_MCD and 1 / MF_MCD have the same denominator, combine the numerators:
MF = MF_P * (((MF_MCD - 1) / MF_MCD) + (GU_B / (GU_T * MF_MCD)))
  10. Replace MF_MCD and GU_T with their static values:
MF = MF_P * (((8 - 1) / 8) + (GU_B / (8,000,000 * 8)))
  11. Simplify over the common denominator 64,000,000:
MF = MF_P * ((7 / 8) + (GU_B / 64,000,000))
MF = MF_P * ((56,000,000 + GU_B) / 64,000,000)

or

MF = MF_P * ((56,000,000 + GU_B) / (2^6 * 10^6))
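The algebra can be sanity-checked numerically, confirming that the update rule collapses to MF = MF_P * ((56,000,000 + GU_B) / 64,000,000) (a sketch assuming real-valued division, unlike the EIP's integer //; function names are mine):

```python
# Check that the original update rule and the simplified closed form agree
# for a few gas-used values.

TARGET_GASUSED = 8_000_000
MINFEE_MAX_CHANGE_DENOMINATOR = 8

def original(mf_p, gas_used):
    delta = gas_used - TARGET_GASUSED
    return mf_p + mf_p * delta / TARGET_GASUSED / MINFEE_MAX_CHANGE_DENOMINATOR

def simplified(mf_p, gas_used):
    return mf_p * (56_000_000 + gas_used) / 64_000_000

for gas in (0, 4_000_000, 8_000_000, 12_000_000, 16_000_000):
    assert abs(original(100.0, gas) - simplified(100.0, gas)) < 1e-9
print("closed form matches")  # note: gas_used == 0 gives 7/8 of the parent fee
```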

My conclusion:

  • MINFEE can only collapse to 0 through PARENT_MINFEE: the update is
    multiplicative in PARENT_MINFEE, so once it reaches 0 it stays 0.
  • A block with block.gas_used equal to 0 does not zero MINFEE by itself;
    it only scales it by 7/8.

Additional question:

  • Can a bad actor intentionally mine a block with 0 gas used?