ERC-3722 -- Poster: A ridiculously simple general purpose social media smart contract

Hey, Jose, nice to meet you! Awesome work with the app!

Maybe I can ask you about this. Right now, the subgraph is just storing the “content” value as a single “rawContent” string, right? That means the subgraph isn’t actually storing the state of the app at all. In other words, to make an app like Twitter work, including user actions like following, blocking, etc., you would need to implement something outside the subgraph that queries all the Posts in the subgraph, parses all the “rawContent” strings, and then constructs the entire state of the app. This could be done when loading a React app, but it seems like it will get out of hand really fast, even with just the Twitter use case, once things like follows get implemented.

It could also be done by a separately hosted service that watches the graph, but wouldn’t it be better to construct the app state in the subgraph, by parsing and processing the content in the events’ content field there? With The Graph’s JSON API, it looks like that should be possible (except that it parses bytes instead of strings). That’s what I was assuming we would be doing based on this discussion, your diagram, etc., but it doesn’t look like that’s what’s happening at the moment. Am I missing something?

Yes, this is how I had envisioned the subgraph working as well.

Indeed, right now the latest subgraph does not consider any of the specifications provided in the original post. This is (now) being tracked in the following issues:

After sleeping on it and giving it some thought (on top of the questions raised by @ezra_w), I now see that the most viable solution is indeed building the entire state in the subgraph based on on-chain data, as originally suggested. Here’s why:


All actions sent via Poster (A) can be parsed in a single run by a Subgraph (B), generating a global state. New actions will only need to update the delta of the global state, always showing the latest state.

In short, by keeping everything on-chain, we can always build the global state via the subgraph while allowing minimal operations on the client, with minimal overhead. Furthermore, we can display chunks of the global state in a granular way, without having to rebuild the state for a particular view (e.g. my likes). And last but not least, composability: the subgraph in charge of the entire social network can itself be composed on, allowing anyone to create sub-networks or other products on top of the whole thing. It would be as if Twitter’s entire content were open source and ready to compose on.
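To make this concrete, here is a rough sketch of how the subgraph mapping could parse each action and update only the affected entities. This is not the actual implementation; the handler, entity, and field names are placeholders, and the JSON keys follow the spec from the original post:

import { json, JSONValueKind, Bytes, ByteArray } from "@graphprotocol/graph-ts"
import { PostCall } from "../generated/Poster/Poster" // placeholder generated binding
import { Post } from "../generated/schema"            // placeholder entity

export function handlePost(call: PostCall): void {
  // The Graph's JSON API works on Bytes, so convert the raw content string first.
  let tryData = json.try_fromBytes(ByteArray.fromUTF8(call.inputs.content) as Bytes)
  if (tryData.isError) return
  if (tryData.value.kind != JSONValueKind.OBJECT) return

  let content = tryData.value.toObject().get("content")
  if (content == null) return
  if (content.kind != JSONValueKind.ARRAY) return

  // Each action only touches the delta of the global state.
  let actions = content.toArray()
  for (let i = 0; i < actions.length; i++) {
    if (actions[i].kind != JSONValueKind.OBJECT) continue
    let action = actions[i].toObject()

    let actionType = action.get("type")
    if (actionType == null) continue
    if (actionType.toString() != "microblog") continue

    let text = action.get("text")
    if (text == null) continue

    let post = new Post(call.transaction.hash.toHex() + "-" + i.toString())
    post.author = call.from.toHex() // placeholder field
    post.text = text.toString()     // placeholder field
    post.save()
  }
}

Other action types (follow, like, delete, …) would just be additional branches that load and update their own entities.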

So now we are on the same page, and I’ll move on with the UI modifications :slight_smile: @ezra_w since you showed some interest in the subgraph part, are you up for modifying the existing one to allow building the needed state? I’ll tackle that next after I’m done with the UI, but it would be good to get some help there.

P.S. I actually reached out to the Ceramic team and got some interesting insights, in particular:

  • The cost of every single operation (i.e. every write needing some gas), and
  • The lack of any need for double-spend protection or global consensus in a content publishing platform.

For the first point, I would argue that L2s can tackle the cost, not to mention that we could put some mechanisms in place to reduce or offset the cost of publishing (e.g. via OpenGSN relayers). An example is how Mirror requires $WRITE tokens to publish, but we can open that Pandora’s box later.

The second point is only true if we assume content has no value, which I would argue it does, so agreeing on who wrote what, and when, does require global consensus. Furthermore, streams (their append-only data model) would require composing the global state somewhere, forcing clients to take on that overhead every time and/or rely on some trusted gateway, which kills decentralization.

In short, I love the Ceramic project but I do not think it fits our current use case at the moment. I’ll still explore it for IDX to allow enriching the profile data, but will do so separately from Poster.

Back to building!

3 Likes

Nice! I already started some work in this direction for the forum here: GitHub - EzraWeller/postum: Decentralized protocol for forums implemented on the Poster contract

It is not well documented (to say the least) or finished at the moment, so it may be difficult to look through, but the subgraph folder has an example of what we’re talking about: it parses the incoming content, assuming it’s stringified JSON (awkwardly converting the string to Bytes first, because that’s what The Graph’s JSON API wants), then constructs state based on that. I created some JSON schemas for the forum actions, which you can find in the json-schema folder.

I think I’m most excited about the forum use-case at the moment, since it seems like there’s an immediate audience for a decentralized version of Discourse in the crypto community, but I’d be happy to help with the Twitter subgraph / other infra as well.

Edit: Got some of the basics working (creating forums, editing forum titles).

3 Likes

Just made a PR into the EIP repo

3 Likes

Hey @ezra_w just wanted to thank you for the tip on try_fromBytes for parsing the JSON content sent via the smart contract. You were indeed correct that it’s currently the only feasible way to parse the JSON, and our subgraph now implements your approach.

As I can see from Postum’s README, you are also trying to support emoji. Have you identified why this breaks the JSON parser? I’ve been running a few tests and I’m not 100% sure why it sometimes fails to parse the JSON altogether. Here are some examples:

For some reason, @auryn’s post (0xd714dd) broke the parser, but when I tried to replicate the same content multiple times I couldn’t reproduce it. It could be that I used ' instead of the original character, but if we have to special-case every UTF-8 code point this is not going to be much fun.

I’ll poke some of The Graph team members and try to figure out what’s going on there.

1 Like

Nice! I actually noticed that Dennison’s subgraph avoided the awkward stringToBytes function I wrote with this: let tryData = json.try_fromBytes(ByteArray.fromUTF8(call.inputs.content) as Bytes), but no variation of that has worked for me. I’d rather use his version if we can, obviously!

Wrt the emojis, I haven’t really looked into it yet. Let me know if you figure it out.

Here is the latest Postum repo, now living in the EthPoster org: GitHub - ETHPoster/postum
Most of the code for a working backend is done. It needs better test coverage (and a frontend to try it out), but I think a lot of the approach can be directly applied to the Twitter app (or pretty much any other use case, for that matter). Once I feel confident in the Postum stuff, I’m happy to work on similar structures for the Twitter app, if that’s helpful.

1 Like

Got this working and it also seems to have fixed emoji support :slight_smile: Just needed to futz with the JSON parsing some more.

2 Likes

I am skeptical that it is feasible from a cost perspective to save single posts onchain, even at layer 2.

This was tried with Leeroy and Peepeth, and transaction costs (which were a lot lower than today) were a barrier to posting.

Peepeth went down the batching route, where each post is signed by the author and batched by the platform, with only an IPFS hash of the batch being saved onchain. This is required to have a trustless timestamp for batched posts.

Services could batch posts to Layer 2, storing them on decentralized storage every minute, and the costs would be greatly reduced, potentially enough that a subscription service could be used.

3 Likes

Noticed that on your latest commit in Postum! I’ll migrate mine after the ETHGlobal judging takes place.

3 Likes

I’ve been orbiting around a few social media projects lately, and I see that most agree that, on a cost/UX basis, 1 tx = 1 post is not feasible in the current scheme of things. My current take goes as follows:

  • Costs can be mitigated by a super cheap network. The cost and gas calculations were true back in 2018, but now, with projects that make a transaction cost a fraction of a penny, even paying for all non-Sybil users doesn’t seem crazy. Even on the latest L2s on top of mainnet (e.g. Arbitrum or Optimism) these costs are minimal.
  • Rollups and bundles can batch transactions. As you mention, batched transactions seem like an optimal option, some sort of hop-on bus that people can append their transactions to. A lot of work has been done in that area since 2018, and I’m super keen on exploring it.
  • Signing/IPFS and off-chain permanence. I understand Peepeth’s approach, but the one thing I struggle with is data permanence. If you are only signing a hash, and the content behind that hash is removed from the network, it is not really immutable. I think there’s a difference between a truly immutable social network and one that is not. Ceramic seems to have a good middle-ground solution which allows anchoring on the blockchain, but I also need to explore that.

There are other projects exploring other alternatives which I’m also curious to see how they evolve and solve these problems.

3 Likes

There is no reason a pattern like Peepeth’s couldn’t be built on top of Poster. You could emit an IPFS hash and then it’s really just up to clients to interpret the hash. But, as @jjperezaguinaga points out, there is a potential persistence issue if the data is not pinned by someone.

Also, even if you are putting the content on chain, it is not necessarily limited to just one post. I don’t think the subgraph properly handles it yet, but the JSON specification in the OP is designed to allow multiple posts in a single transaction.
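For example (following the format from the OP), a single call to post could carry several entries in the content array:

{
  "content": [
    {
      "type": "microblog",
      "text": "first post in this transaction"
    },
    {
      "type": "microblog",
      "text": "second post in this transaction"
    }
  ]
}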

3 Likes

I don’t think there would be a long-term persistence issue here as long as we can read IPFS from the subgraph: the Poster contract emits an event with an IPFS hash, and the content only needs to be available long enough for the subgraph to retrieve the data from IPFS, which it then processes as a bunch of events – the subgraph then becomes the long-term state storage and we don’t really need the IPFS hash anymore.
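Roughly, a mapping along these lines could work, assuming the event carries the IPFS hash as a string (the event and its params are placeholders here); graph-ts’s ipfs.cat returns null if the file can’t be fetched:

import { ipfs, json } from "@graphprotocol/graph-ts"
import { NewPost } from "../generated/Poster/Poster" // placeholder event binding

export function handleNewPost(event: NewPost): void {
  // Assume the event's content param is an IPFS hash pointing at a batch of signed posts.
  let data = ipfs.cat(event.params.content)
  if (data) {
    let tryBatch = json.try_fromBytes(data)
    if (tryBatch.isError) return
    // ...walk the batch and store each signed post as its own entity...
  }
}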

I think the main issue here is the centralization of packaging a bunch of signed messages and putting them on IPFS – without some decentralized protocol for that, you have to trust the packager not to censor anyone’s messages. This is similar to Snapshot or Gnosis Safe today, so it’s not an issue that would be unique to us.

For now, I’m hopeful we could deploy this on some layer 2s that are essentially free – we need way less security for these media use cases than for DeFi, I think, which hopefully means we can go straight to the cheapest censorship-resistant L2s.

2 Likes

I’ve been thinking about this some more and I think it’s worth adding a new operation (permissions) and a new field (from) to the JSON standard.

A post to grant all permissions to an account.

{
  "content": [
    {
      "type": "permissions",
      "account": "<account_to_set_permissions>",
      "permissions": {
        "post": true,
        "delete": true,
        "like": true,
        "follow": true,
        "block": true,
        "report": true,
        "permissions": true
      }
    }
  ]
}

A post from an account with permissions to post on behalf of another account. If the ‘from’ account has given permission to this account, clients should treat this message as if it came from the ‘from’ address.

{
  "content": [
    {
      "type": "microblog",
      "text": "This is a post from an account with permissions to post on behalf of another account.",
      "from": "<from_address>"
    }
  ]
}
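For illustration, a subgraph or client resolving the effective author of such a post could do something like this (the Permission entity, its key format, and its fields are hypothetical):

// Hypothetical sketch: "Permission" is assumed to be an entity keyed by "<granter>-<grantee>"
// with one boolean flag per operation (post, delete, like, ...).
import { Permission } from "../generated/schema"

function effectiveAuthor(sender: string, from: string): string {
  if (from == sender) return sender
  let grant = Permission.load(from + "-" + sender)
  if (grant == null) return sender
  // Only honor "from" if that account granted the sender the "post" permission.
  return grant.post ? from : sender
}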

Perhaps some version of this should be PIP-0002? (assuming PIP-0001 is a template similar to EIP-1)

2 Likes

Editing (i.e. updating) posts should be included in the standard for posting.

An update is a new post with a reference to a previous post, stating that it is an update that replaces it. Multiple updates should be allowed.

Clients can show only the latest update to a post (and optionally show previous posts).
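For example, something along these lines, where the field pointing at the replaced post (called "replaces" here, name to be decided) would need to be standardized:

{
  "content": [
    {
      "type": "microblog",
      "text": "This is the corrected version of my earlier post.",
      "replaces": "<id_of_post_being_updated>"
    }
  ]
}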

2 Likes

Great suggestion! I’ll add this to the standard.

1 Like

Have you all checked out my set of JSON schemas for Postum? I’m using this pattern:

{
  "action": "CREATE_FORUM",
  "args": {...}
}

With a different schema for each database mutation, e.g. create/edit/delete forum/thread/post. This seems like the right general framework for Poster apps to me.

edit: I guess maybe I’m suggesting a PIP here? I’m not really keyed into that system at this point, but it would be something like this: content should use the above format or something similar (action, args), and the Poster community should maintain an action namespace (i.e. only one “CREATE_FORUM” schema), to help make Poster apps more interoperable.

1 Like

This seems reasonable to me. Do you see this as conflicting with the proposed JSON schema for Twitter-like posts?

Maybe? I guess we should set up an EIP-like repo for PIPs.

Some thoughts:

  • Is it really okay to “store” the message in blocks but not in the state? As the chain lives longer and longer, there might be fewer and fewer people storing the whole block history. I’m confident there will still be people and companies doing it, but once the storage reaches tens of terabytes, it might be highly skewed towards companies offering paid API access (arguably this can be unbundled and decentralized too, though). It might be okay, but it’s worth thinking about explicitly. On the other hand, storing in the state makes the costs prohibitive.

  • Has anyone looked at how Arweave incentivizes storage? They seem to have done some thinking on that front. Maybe something similar could be reproduced / incentivized by a contract?

1 Like

A different type of general-purpose contract can be made using Unicode code points. Emoji and logographic symbols can be sent via this contract as well.

contract Poster {
    event NewPost(address indexed account, uint256[] unicodes);

    function post(uint256[] calldata unicodes) public {
        emit NewPost(msg.sender, unicodes);
    }
}
1 Like