Jello Paper as Canonical EVM Spec

Thanks @ehildenb. To be clear, @grosu and @expede, I’m a big fan of K-EVM and the Jello Paper. I’m also a big fan of the work to maintain and improve the Yellow Paper. But I’m becoming convinced that, given our anarchy, there is nobody to declare one spec normative, only community consensus that certain specs are useful. And given the computational consensus network we are building, there is no need for a normative spec; the network itself is normative. To mangle Heinlein, “The blockchain is a harsh mistress.”

Hmm, I’m not sure why LOC would matter here. @gcolvin, am I missing something? IMO a spec should be complete and clear, without regard for length (either short or long).

Anyhow, here’s some information in a table

Spec     kLOC  Language  Style              Executable           Complete             Days Since Update
Yellow   ~1.3  LaTeX     Prose & Equations  :x:                  Mostly               37
Beige     1.3  LaTeX     Prose              :x:                  :x:                  249
Jello     2.4  K         Declarative        :white_check_mark:   :white_check_mark:   7
Trinity   2.9  Python    Imperative         :white_check_mark:   :white_check_mark:   0
Aleth     2.4  C++       Imperative         :white_check_mark:   :white_check_mark:   0

To be clear, @grosu and @expede, I’m a big fan of K-EVM and the Jello Paper.

:tada::tada::tada: :mage:

there is no need for a normative spec

:thinking: Thinking out loud

Interesting. Shouldn’t the semantics of the execution engine be strongly guaranteed by the platform, lest a network split occur from differing state? AFAICT Ethereum is intended to have a logically centralized VM. Such a split would get healed, and is absolutely possible today, but it would also cause some general thrash in the system (uncle rate), which is best avoided, no?

TL;DR an impassioned defense of the Jello Paper’s notation

I’m trying to balance facilitating discussion and being opinionated :sweat_smile: Take the following with a grain of salt, and opposing viewpoints are very strongly encouraged!

While almost everyone I speak to is immediately positive about moving to the JP, I feel like there are two points of resistance:

  1. No one seems to love the YP’s notation as-is, but some have already invested in learning it
  2. Others don’t want to invest in learning a new notation, or assume K is for math geniuses only

K doesn’t read like the PLT literature (aside from BNF). As such, there’s no special advantage on a first reading of the JP vs the YP, even for people with such a background (omitting those who already know K). From an informal sample, K appears to be much easier for programmers to grok than judgements or the lambda calculus. Further, since it’s executable (and formalized), the JP has no option but to be extremely clear about all details, which is what you want out of a spec.

The YP’s notation also takes some getting used to. I personally find its syntax to be unnecessarily dense (ex. “which μ subscript is which again?”). Also, some details are unclear, so I found myself repeatedly referring to concrete implementations (mainly Parity and Trinity) while writing my implementation based on the YP. It’s a small sample, but I hear this from others as well. Admittedly this could be corrected in the notation and prose, but the paper is almost 5 years old and still reads this way :woman_shrugging:

2 Likes

I’m just going to vaguely gesture at this thread as more reason to have a formally verified spec. Explicit invariants are also very important. We can go pretty far enshrining invariants in the system.

I personally would love if the community would take a first-principles approach on Ethereum, with only explicit invariants. The current system is loose enough to make spec change conversations difficult.

4 Likes

I was starting to think about semantic versioning.

One notion that I arrived at is that the protocol should itself have a semantic version, and that its component parts (the EVM, consensus, RLP, JSON-RPC) should each have semantic versions. Within their various contexts, version changes would have different meanings.

The Ethereum protocol would increment MAJOR when an update to any of its components leads to a fork.

The EVM would increment MAJOR only if the update leads to incompatibility or a security vulnerability (given contract development norms). A MAJOR version increment in the EVM is similar to a new series in microprocessors. But a MINOR version increment in the EVM would lead to a MAJOR increment at the protocol level.
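
As a rough sketch of the bump rules just described (all names here are hypothetical illustrations, not from any EIP or client): a consensus-affecting EVM change, even a MINOR one, forces a protocol-level MAJOR increment because every node must fork.

```python
# Hypothetical sketch of the versioning rule above. An EVM MAJOR or
# MINOR bump changes consensus rules, so the protocol MAJOR increments
# (a fork); a pure bug-fix PATCH leaves the protocol version untouched.
# Names and structure are illustrative only.

from dataclasses import dataclass


@dataclass
class SemVer:
    major: int
    minor: int
    patch: int

    def __str__(self) -> str:
        return f"{self.major}.{self.minor}.{self.patch}"


def protocol_after_evm_bump(protocol: SemVer, kind: str) -> SemVer:
    """Return the protocol version implied by an EVM version bump."""
    if kind in ("major", "minor"):
        # Consensus rules changed: the whole network must fork.
        return SemVer(protocol.major + 1, 0, 0)
    # A patch-level fix with no semantic change does not fork the network.
    return protocol


print(protocol_after_evm_bump(SemVer(3, 2, 1), "minor"))  # -> 4.0.0
```

The asymmetry is the interesting part: component versions and the protocol version move at different rates, which is exactly what independent semantic versions per component would make explicit.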

2 Likes

:100::100::100: I very much want semantic versioning!

  1. It’s currently difficult to track the various names and EIPs that should be applied in a certain order
  2. Clearer for users (“oh, this implements Ethereum 4.0.0, so something is not backwards compatible”)
  3. Distinctions between bug fixes and semantic changes are generally useful
2 Likes

@expede I simply found LOC interesting. First because I was surprised that the C++ version was no longer than the others, despite being a low-level language bummed for performance. It was shorter when I started work on it, and I’d have expected Python to be shorter still. And second because the K version was no longer than the rest. Formal specs used to be a lot longer than the code they formalized.

You talk to different people than I do. In that thread of core dev discussion I shared I think only Everett was unconditionally positive about dropping the Yellow Paper.

This is an old discussion, with a chapter on it in The Mythical Man-Month. We are in the usual bind. Almost all engineers and mathematicians can read English and ordinary math. They cannot, without prior study, read formal notations like K. And they usually haven’t the time or desire for such study. (One way out is an English spec and a formal appendix. Another is to make the spec a tutorial on the formal notation.)

Regardless, as long as people want to maintain and improve the Yellow Paper there will be one. And if people want to complete a Jello Paper there will be one. And keeping them in sync with each other, the various clients, and the network itself will be a pain. It’s pain with only one spec too.

1 Like

Until last year the copyright prevented anyone but Gav from changing it. Yoichi put a lot of work into it (and into formalizing the EVM), but then decided to stop working on crypto. So only recently are others taking up the task.

Thanks for taking the time to explore this! :tada::tada::tada:


Copyright Issues

Ah, excellent context — thanks! That’s a pretty horrendous situation for a community project to be in :scream: The copyright for such a foundational component of this project absolutely should have been released much earlier, but that’s neither here nor there.

It sounds like there’s a solve now, though, so that’s great :+1:


Unnamed Sources

My original source was the controversial semi(?) closed discussion in Prague, where it was brought up by someone else (I can’t remember who for the life of me). The Twitter Thread has a number of supporters, as well as some of the folks in this thread (including you :smiley: :man_mage: :tada:).

I agree that there is no strong consensus on this change!

I really enjoyed the IETF’s “rough consensus and running code” concept that came up at the EM Council of Prague. In the Jello Paper we have a more complete & working spec, and support from a number of people in the know. The question becomes whether this is an “over my dead body” situation, and if so, why?


The Yellow Paper is Dead, Long Live the Yellow Paper

Wonderful! Let people use whichever notation they like, and let a thousand flowers bloom! But even then, why not make the canonical spec for EIPs the formally verified, executable one? People can still translate concepts into other notations, or have them generated directly by the K Framework. This was the intent of creating the Beige Paper by hand (since the Yellow Paper was too obtuse for most people), but the canonical spec was still assumed to be the YP.

One such setup could look like this:

                     Jello Paper
                   (canonical spec)
                         |
                         |
      +------------------+---------------------+
      |                  |                     |
      |                  |                     |
Yellow Paper         Beige Paper       BNF & Proof Trees
(historical)        (accessible)        (mathematicians)

The Comprehensibility of K

I actually strongly disagree on needing formal training to read K :open_mouth:

I’m not sure that I’d call K a “formal notation”. It is very much a programming language, and it generates both executable code and documentation (conceptually in whatever notation you would like). It reads in a way not unlike modern Javascript’s arrow functions with destructuring, and consuming a K-based spec requires no more study of PLT or specialized mathematics than any other programming language does. Presumably you would like the writers of the spec in any notation to have some relevant background, though they may or may not be experts in K (I certainly am not one!)

For reference on how much more familiar K is than operational semantics, see the illustrative examples below

Example of K (from Jello Paper)

    syntax UnStackOp ::= "POP"
    rule <k> POP W => . ... </k>

    syntax StackOp ::= DUP ( Int ) | SWAP ( Int )
    rule <k> DUP(N)  WS:WordStack => #setStack ((WS [ N -Int 1 ]) : WS)                      ... </k>
    rule <k> SWAP(N) (W0 : WS)    => #setStack ((WS [ N -Int 1 ]) : (WS [ N -Int 1 := W0 ])) ... </k>

    syntax PushOp ::= PUSH ( Int , Int )
    rule <k> PUSH(_, W) => W ~> #push ... </k>

Formal Notation

This is what a common formal notation looks like (the inference-rule figure from the original post is not reproduced here):


Call For “Over My Dead Bodies” :skull_and_crossbones:

Correct me if I’m wrong, but you [Greg] are not an “over my dead body”. I would love to hear directly from people that are dead set against this idea, and why that’s the case for them! If you could point the people that you’re talking to at this thread, that would be very helpful!

2 Likes

The questions I keep bumping into are, “Who has the authority to make it canonical?” and, “Do we really need one spec to be canonical?”

I’d say the current consensus is that the Yellow Paper specifies the Ethereum protocol, and that where it doesn’t that is a defect that needs fixing. No other specification that I know of is complete enough to even attempt to do that job. When there are other specs that can then we can ask about their relative status.

Ok, but I can follow the VM part of the Yellow Paper well enough to write code, and can just barely follow the KEVM. And yes, the notation you quote is unreadable too (though it’s not the same as the K), but the math in the Yellow Paper is much easier than your example, and could be made even easier.

It looks to me like different specs are useful in different ways to different people (and programs). Different enough that there are people who can use some but cannot use the others.

2 Likes

To continue the semver sub-thread here, @axic pointed out EIP-178 for versioning the EVM:

1 Like

Another is to make the spec a tutorial on the formal notation.

I very much support this idea. Having an executable specification will help with verification efforts and proving invariants. The YP does a good job of being self-contained, being a tutorial in its own notation. I don’t see why the JP couldn’t do the same thing – making itself usable to anyone with a year or so of undergraduate study.

2 Likes

Thank you for all the comments above about K-EVM, both the positive and the negative ones! Of course, as creators and everyday users of KEVM it is hard for us at RV (http://runtimeverification.com) to see how it falls short, which is why feedback like this is so important. Here is another thing that may be useful to consider:

  • To my knowledge, there are three other companies besides RV using KEVM in their smart contract execution or auditing. In particular, we have 2-3 smart contracts under “formal auditing” at any given moment. So there is a huge incentive to keep it updated and current! This may not happen with a paper model/semantics, because that is not directly used. Additionally, there are several formal tools based on it: an interpreter (which, by the way, will soon be much faster, because we transitioned from OCaml to LLVM as the target language we generate from K), a bounded symbolic state-space explorer, and a program verifier. And tens of thousands of tests for each of these that we execute in our CI. These tend to find bugs in the spec quickly, for the benefit of all. A single tool, say an execution engine, can only find a particular subset of bugs. For example, what if your spec happens to be non-deterministic? You may not find that with the interpreter, but you can find it with the state-space exploration tool (model checker).

If I understand it well, the main criticism is the lack of readability. I must confess that we put almost no effort into making it readable! Our main focus was to make it precise and to have all the tools work with it. This is good in my view, because we can now address the readability part. It may only be a matter of having a good technical writer spend 2 months on it and turn it into a book/tutorial in literate programming style. Or two writers, one technical and one non-technical. We could do this if there is interest.

7 Likes

Hello, Dominik here from Gnosis, one of the companies that uses K-EVM internally. I recently completed the K onboarding, thought I’d share my experience:

  1. There is a pretty good K tutorial available. The only prerequisite that I remember is lambda calculus, but looking back, it is totally possible to skip 1_lambda and 3_lambda++ and complete only 2_imp and 4_imp++ to grasp K.
  2. I have read through the entire K-EVM specification. I disagree with @grosu that it is not readable. :slight_smile: If you have any experience with the functional paradigm in Haskell or even Javascript, it is very readable, and even if one doesn’t, it is very easy to get into.
    What makes K readable in my experience is its explicitness - everything is defined in configuration, syntax and rules - as such the language definition is self-contained, which is super powerful. I personally use K-EVM as the canonical specification of the EVM - if I am unsure about the semantics of an opcode/gas calculation/whatever, I refer to that definition, because of its completeness and simplicity.
  3. I have also used K for formal verification. This, imho, is its weak point when it comes to usability. The kprove tool is quite difficult to use, and I found I had to contact RV folks iteratively to get help. In our discussions, this is one thing they are working on, along with many other things like producing easily verifiable proofs (as they switch from the Java to the Haskell backend), so I have no doubt that this will improve shortly. But specification of the EVM, rather than of its programs, is the main topic of this discussion, and for that, nothing beats K.

Talking about tutorials/onboarding, I put together a formal verification workshop not long ago. I would be happy to expand this into a full-fledged course (working with RV) if there is demand from the community!

Best,

5 Likes

Maybe we can schedule a workshop for Council of Paris? Or at least, that week, maybe during ETHCC or after? This is the kind of not-quite-council stuff that I am happy to help volunteer around if there is interest.

When you say “I found I had to contact RV folks” – this is the sort of thing where public questions (e.g. github issues or forum threads) would be super helpful. Many times, multiple people have the same question, and either through search or from following along with a discussion, can get answers and learn.

2 Likes

Totally agree. A workshop at Council of Paris would probably be the easiest option, who can I get in touch with? :slight_smile: I could give an introduction into K, K-EVM and deductive program verification using K-EVM!

2 Likes

An aside: Isn’t the Council of Paris just one day? Seems it might not be worth squeezing in many workshops. Better to spend most of our time on the important issues affecting the Fellowship as a whole.

@dteiml that would be me :wink: PM me / I’ll reach out and we can see about organizing.

I’m thinking a few events post ETHCC with smaller groups.