
Init Airdrop Schema, Clean up OP Drop 1, Init OP Token Data #2622

Closed
wants to merge 56 commits

Conversation


@MSilb7 MSilb7 commented Feb 4, 2023

Brief comments on the purpose of your changes:

Cleaning up some naming & methodologies around OP Airdrop #1, adding basic token metadata for tracking, and initializing a generalized airdrop schema.

Note: there may be some downstream implications from how the OP Airdrop #1 tables are named; they should have specified drop 1, ideally something like airdrop.op_airdrop_1_addresses.

Open to other schema-type modifications, since airdrops can happen on any chain.

Edit Feb 11: this has sprawled a bit, but it also tries to unify the scattered OP-related folders within an 'OP' folder.

Side-note: I wonder if there are some generic inputs we could set up for the MerkleClaim fork airdrop contracts (i.e. give the table name, the field name for the claimer, the field name for the amount, and the token address, and that automatically populates the spell); a rough sketch follows below.
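A hypothetical sketch of what such a macro could look like (the macro name, arguments, and event columns are assumptions for illustration, not an existing spellbook macro):

```sql
-- Hypothetical sketch only: merkle_claim_airdrop and its arguments are
-- invented names. Assumes a decoded MerkleClaim-style event table with the
-- standard Dune decoded columns (evt_block_time, evt_tx_hash).
{% macro merkle_claim_airdrop(claim_table, claimer_field, amount_field, token_address) %}
select
    evt_block_time as block_time,
    evt_tx_hash as tx_hash,
    {{ claimer_field }} as recipient,
    {{ amount_field }} as amount_raw,
    '{{ token_address }}' as token_address
from {{ claim_table }}
{% endmacro %}
```

A per-airdrop model would then reduce to a one-line macro call passing the decoded claim table and its field names.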

For Dune Engine V2

I've checked that:

General checks:

  • I tested the query on dune.com after compiling the model with dbt compile (compiled queries are written to the target directory)
  • I used "refs" to reference other models in this repo and "sources" to reference raw or decoded tables
  • if adding a new model, I added a test
  • the filename is unique and ends with .sql
  • each sql file is a select statement and has only one view, table or function defined
  • column names are lowercase_snake_cased
  • if adding a new model, I edited the dbt project YAML file with new directory path for both models and seeds (if applicable)
  • if wanting to expose a model in the UI (Dune data explorer), I added a post-hook in the JINJA config to add metadata (blockchains, sector/project, name and contributor Dune usernames)
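For the last item above, a hedged sketch of the JINJA config with a metadata post-hook, following the expose_spells pattern used elsewhere in the repo (the alias and contributor values here are placeholders):

```sql
-- Placeholder values throughout; adjust to the actual model. The
-- expose_spells arguments are blockchains, spell type, spell name, and
-- contributor Dune usernames.
{{ config(
    alias = 'op_airdrop_1_addresses',
    post_hook = '{{ expose_spells(\'["optimism"]\',
                                  "project",
                                  "airdrop",
                                  \'["msilb7"]\') }}'
) }}
```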

Pricing checks:

  • coin_id represents the ID of the coin on coinpaprika.com
  • all the coins are active on coinpaprika.com (please remove inactive ones)

Join logic:

  • if joining to a base table (e.g. ethereum transactions or traces), I looked to make it an inner join if possible

Incremental logic:

  • I used is_incremental & not is_incremental jinja block filters on both base tables and decoded tables
    • where block_time >= date_trunc("day", now() - interval '1 week')
  • if joining to a base table (e.g. ethereum transactions or traces), I applied join condition where block_time >= date_trunc("day", now() - interval '1 week')
  • if joining to prices view, I applied join condition where minute >= date_trunc("day", now() - interval '1 week')
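For reference, a minimal sketch of the incremental filter pattern described in this list (the source and column names are illustrative, not one of this PR's models):

```sql
-- Hedged sketch: applies the one-week lookback filter only on
-- incremental runs, per the checklist above.
select
    t.block_time,
    t.hash as tx_hash
from {{ source('optimism', 'transactions') }} t
{% if is_incremental() %}
where t.block_time >= date_trunc("day", now() - interval '1 week')
{% endif %}
```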


dune-eng commented Feb 4, 2023

Workflow run id 4093189292 approved.


dune-eng commented Feb 4, 2023

Workflow run id 4093189293 approved.


dune-eng commented Feb 4, 2023

Workflow run id 4093433004 approved.


dune-eng commented Feb 4, 2023

Workflow run id 4093433003 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094161489 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094161491 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094186549 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094186548 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094211780 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094211778 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094217459 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4094217445 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4096964120 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4096964118 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4097008045 approved.


dune-eng commented Feb 5, 2023

Workflow run id 4097008043 approved.


dune-eng commented Mar 8, 2023

Workflow run id 4367116542 approved.


dune-eng commented Mar 8, 2023

Workflow run id 4367099481 approved.


dune-eng commented Mar 8, 2023

Workflow run id 4367341283 approved.


dune-eng commented Mar 8, 2023

Workflow run id 4367341290 approved.


MSilb7 commented Mar 17, 2023

Bumping this + there's also the OP Airdrop #2 address list, which was too large to put in a table: https://github.com/ethereum-optimism/op-analytics/blob/main/reference_data/address_lists/op_airdrop2_addresses.csv

@dune-eng

Workflow run id 4443507915 approved.

@dune-eng

Workflow run id 4443507912 approved.


@Hosuke Hosuke left a comment


@Hosuke Hosuke added ready-for-final-review and removed WIP work in progress labels Mar 17, 2023
@jeff-dude jeff-dude added in review Assignee is currently reviewing the PR and removed ready-for-final-review labels Mar 19, 2023
@jeff-dude
Member

> Bumping this + there's also the OP Airdrop #2 address list, which was too large to put in a table: https://github.com/ethereum-optimism/op-analytics/blob/main/reference_data/address_lists/op_airdrop2_addresses.csv

As for large files, are you in contact with Niles and the ingest team for their upload process? That may be best. cc @nileslawrence

@jeff-dude
Member

@MSilb7 can you help me understand the intention of this airdrop spell better?

  • I'm seeing them set up as incremental, but most have an end date. If there is an end date, what is the use case for ongoing incremental loads?
  • Some models have an end date variable, but no usage downstream.
  • Some models don't have end date variables; I assume that means the claim window is still open?

I wonder if these would all make sense as tables, rather than incremental? And if there is an end date, a static tag could be applied in config to tell our orchestration that the model doesn't need to run on an ongoing basis (a sketch of such a config follows below).

I also thought maybe the intention was to track the airdrop transfers over time, but the inner join to the transactions base table on block_number and a particular hash makes me think it isn't ongoing.
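For illustration, a hedged sketch of the static setup suggested here (the 'static' tag name is an assumption about the repo's orchestration conventions):

```sql
-- Completed airdrop: materialize once as a plain table and tag it so
-- orchestration can skip ongoing runs.
{{ config(
    materialized = 'table',
    tags = ['static']
) }}
```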

@dune-eng

Workflow run id 4492031679 approved.

@dune-eng

Workflow run id 4492031689 approved.


MSilb7 commented Mar 23, 2023

Moving this down to draft in favor of @hildobby's #2979.

Keeping this up for reference & will move the OP Token Schema to a new PR.

@dune-eng

Workflow run id 4500579524 approved.

@dune-eng

Workflow run id 4500579511 approved.

@MSilb7 MSilb7 marked this pull request as draft March 23, 2023 12:32
@MSilb7 MSilb7 mentioned this pull request Mar 23, 2023
@jeff-dude
Member

Closing this; can track in the tagged PRs.

@jeff-dude jeff-dude closed this Apr 13, 2023