@vethereum @JustinZal While the BLS precompile would make verifying data related to the consensus layer easier, my team and I at Succinct have been able to work around its absence by verifying BLS signatures inside a zkSNARK and then verifying that proof on-chain.
Inside our smart contract, we have a zkBLSVerify function that can be called to emulate the BLS precompile. The tradeoff is that a proof must be generated off-chain (which takes under 2 minutes). The function below is taken from our smart contract (code here), so it's specific to our use case, but it can easily be generalized to any other application that requires the precompile.
If the BLS precompile is a blocker for an exciting application you have in mind, please let us know and we would be happy to help out (just email us at hello@succinct.xyz).
Agreed with @jtguibas. Without a precompile for the BLS curve, verifying the signature is difficult. By the way, when we talk about BLS signatures here, we mean BLS (Boneh-Lynn-Shacham) signatures over the BLS12-381 (Barreto-Lynn-Scott) curve, right? (The two "BLS" acronyms refer to different things.)
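To make the distinction concrete, here is a sketch of the check involved, in the minimal-pubkey variant used by Ethereum's consensus layer (public keys in G1, signatures in G2):

```latex
% BLS (Boneh--Lynn--Shacham) signatures over the BLS12-381
% (Barreto--Lynn--Scott) curve, minimal-pubkey variant:
%   sk \in \mathbb{Z}_r, \quad pk = sk \cdot g_1 \in G_1,
%   \quad \sigma = sk \cdot H(m) \in G_2.
\mathrm{Verify}(pk, m, \sigma):\qquad
  e(g_1, \sigma) \stackrel{?}{=} e(pk, H(m))
```

Both sides equal $e(g_1, H(m))^{sk}$ by bilinearity, which is exactly the kind of pairing computation a precompile would make affordable on-chain.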
I was wondering whether there is a motivation for not including compression/decompression in the EIP. Is it because providing uncompressed elements is cheap enough at 16 gas per calldata byte?
If so, we may need to reconsider: for rollups this ratio is different, and data is much more expensive relative to computation. Since many rollups want to stay as close as possible to Ethereum L1, maybe we should add precompiles for decompression?
The reason for not including decompression is indeed the cost (on L1). This EIP already had a large scope, so G1 and G2 decompression was skipped.
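For readers unfamiliar with what a decompression precompile would do: a compressed point stores only the x-coordinate plus one bit selecting which square root to take for y, and decompression recovers y by computing that square root in the base field. The sketch below uses a tiny toy curve (y² = x³ + 7 over p = 103, chosen so that p ≡ 3 mod 4 and a square root is a single exponentiation, as in BLS12-381's base field); the curve, prime, and function names are illustrative, not from the EIP.

```python
# Toy point decompression on y^2 = x^3 + 7 over F_p with p = 103 (p % 4 == 3).
# Real BLS12-381 decompression works the same way over a 381-bit prime field,
# plus infinity/subgroup checks that this sketch omits.

P = 103          # toy base-field prime, P % 4 == 3
A, B = 0, 7      # toy short-Weierstrass coefficients

def compress(x, y):
    """Encode a point as (x, parity-of-y)."""
    return x, y & 1

def decompress(x, y_parity):
    """Recover y from x and a parity bit, or raise if x is not on the curve."""
    rhs = (pow(x, 3, P) + A * x + B) % P
    # Since P % 4 == 3, a square root (if one exists) is rhs^((P+1)/4) mod P.
    y = pow(rhs, (P + 1) // 4, P)
    if (y * y) % P != rhs:
        raise ValueError("x is not the abscissa of a curve point")
    return y if (y & 1) == y_parity else P - y

# Round trip: (1, 76) is on the curve since 76^2 = 8 = 1^3 + 7 (mod 103).
x, parity = compress(1, 76)
assert decompress(x, parity) == 76
assert decompress(x, 1 - parity) == 103 - 76  # the other square root
```

The point of the exercise: decompression is one field exponentiation plus a sign fix, which is why it was considered cheap enough to leave to calldata on L1 but may be worth a precompile where data costs dominate.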
I understand there are no champions for the EIP, but the situation is really bad. The operations are already implemented in consensus-layer clients, running on every node.
Right now, all zk apps are using bn254, whose security level is poor: well below 128 bits by current estimates.
At some point it may become possible to forge proofs for such protocols, if it isn't already.
Developing zk apps takes longer than developing ordinary contracts, which means the sooner this is deployed, the better. Apps, especially ones without ownership, would not be able to easily migrate off bn254.
Oh, sad. It seems like yet another chicken-and-egg problem in requesting a precompile!
There seems to be a pattern: whenever a precompile is proposed, there is a lot of debate about whether it will see enough usage. The chicken-and-egg nature is that without deploying the precompile we can't measure adoption, and without adoption it's hard to argue for making it a precompile, so interest stalls.
Good news: we propose a "progressive path towards precompiles", where an author proposes a precompile but, instead of requiring it to be deployed as a precompile first, deploys a precompile candidate, which is a traditional smart contract (with a higher gas cost per operation, though), and sees whether there is usage. Once the usage is validated, the data can be used to (1) justify the need for the precompile, and (2) serve as a call pattern to help client implementers tune performance.
I can see that the EIP needs a champion to take it through. I hate to ask the obvious, but do any of the original EIP authors intend to take on that role (courtesy ping to @shamatar @ralexstokes)? Failing that, could a long-standing community member take it on (maybe @axic @matt @maxsam4)? Failing that, I would be happy to take on the role myself.
Hi @QEDK! I intend to push for inclusion of this EIP in the next hard fork after Dencun.
The core dev process hasn't really focused on planning that next hard fork yet, so there hasn't been much activity while we all focus on the Deneb/Cancun EIPs.
I just implemented a KZG polynomial commitment to replace a Merkle tree system. The interpolation, commitment, and proof generation were done in Rust; I'm now writing the verification contract and just discovered that this precompile has not been included yet.
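For anyone following along, the algebraic identity a KZG opening proof relies on is p(X) − p(z) = q(X)·(X − z), where q is the quotient polynomial the prover computes. A real verifier checks this equation "in the exponent" at a secret setup point s via a pairing on BLS12-381, which is exactly where this precompile would come in. The toy sketch below checks the same identity directly in a small prime field, with s public (so there is no hiding); the field size, values, and function names are illustrative only.

```python
# Toy illustration of the polynomial identity behind a KZG opening proof:
#   p(X) - p(z) = q(X) * (X - z)
# A real verifier checks this at a secret point s via a pairing; here we
# simply evaluate both sides in a small prime field, with s known.

Q = 101  # toy field modulus (a real setup uses the BLS12-381 scalar field)

def poly_eval(coeffs, x):
    """Evaluate a polynomial (lowest-degree coefficient first) at x, mod Q."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % Q
    return acc

def quotient(coeffs, z):
    """Synthetic division of p(X) - p(z) by (X - z); returns q's coefficients."""
    q = [0] * (len(coeffs) - 1)
    carry = 0
    for i in range(len(coeffs) - 1, 0, -1):
        carry = (coeffs[i] + carry * z) % Q
        q[i - 1] = carry
    return q

p = [3, 1, 4, 1, 5]   # p(X) = 3 + X + 4X^2 + X^3 + 5X^4
z = 7                 # evaluation point being "opened"
s = 42                # toy trusted-setup point (public here, secret in real KZG)

q = quotient(p, z)
lhs = (poly_eval(p, s) - poly_eval(p, z)) % Q
rhs = (poly_eval(q, s) * (s - z)) % Q
assert lhs == rhs     # the identity a pairing check enforces on-chain
```

In the on-chain version, the contract holds commitments g·p(s) and g·q(s) instead of the polynomials themselves, and the pairing precompile is what lets it check this multiplication of hidden values, hence the dependence on EIP-2537 (or a Solidity fallback) for the verification contract.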