Merge branch 'main' into xcopy-fix
jeff-dude authored Jan 2, 2025
2 parents 47bab71 + 01fb851 commit fffb216
Showing 83 changed files with 1,297 additions and 196 deletions.
6 changes: 3 additions & 3 deletions CONTRIBUTING.md
@@ -19,7 +19,7 @@ Contributions in the form of issues and pull requests are very much welcome here

## [BETA] Pre-push hooks

UPDATE: These pre-push hooks require running `dbt compile` which is a fairly slow step due to the size of our project. We intend to rewrite these hooks to be more efficient but for the time being they remain cumbersome. Feel free to use them if you find them useful but the same checks will run in a Github Action when you commit your code. Feel free to uninstall if they do not bring joy, we'll let wizards know when we think we've improved them enought to warrant making them part of the general development flow.
UPDATE: These pre-push hooks require running `dbt compile` which is a fairly slow step due to the size of our project. We intend to rewrite these hooks to be more efficient but for the time being they remain cumbersome. Feel free to use them if you find them useful but the same checks will run in a Github Action when you commit your code. Feel free to uninstall if they do not bring joy, we'll let wizards know when we think we've improved them enough to warrant making them part of the general development flow.

We are testing out adding pre-push hooks to our workflow. The goal is to catch common errors before code is pushed and
streamline the pull request review process.
@@ -126,13 +126,13 @@ example custom test:
```sql
with unit_test1 as
(select
case when col1 == 2 and col2 == 'moon' then True else False end as test
case when col1 = 2 and col2 = 'moon' then True else False end as test
from {{ ref('mock_table' )}}
where tx_id = '102'),

unit_test2 as
(select
case when col1 == 2 and col2 == 'moon' then True else False end as test
case when col1 = 2 and col2 = 'moon' then True else False end as test
from {{ ref('mock_table' )}}
where tx_id = '103'),
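-- Hypothetical sketch only, not part of this file: after any remaining
-- unit-test CTEs, a final query like the one below typically returns the
-- failing rows, so the custom test passes only when zero rows come back.
select test
from
    (select test from unit_test1
     union all
     select test from unit_test2) all_tests
where test = false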

4 changes: 2 additions & 2 deletions Pipfile
@@ -7,7 +7,7 @@ name = "pypi"
numpy = "2.0.12"
pre-commit = "2.20.0"
pytest = "7.1.3"
dbt-trino = "1.8.2"
dbt-trino = "1.9.0"

[requires]
python_version = "3.9"
python_version = "3.9"
209 changes: 98 additions & 111 deletions Pipfile.lock

Large diffs are not rendered by default.

@@ -107,6 +107,7 @@
'PenguinStrategyForLPb',
'PlatypusStrategy',
'SonicStrategyForSA',
'StableJackStrategy',
'StableVaultStrategyForS3D',
'StableVaultStrategyForS3F',
'StargateStrategyForLP',
@@ -115,6 +116,7 @@
'StormStrategyForSA',
'SuStrategyV2',
'SynapseStrategy',
'TokenMillStrategy',
'UnipoolStrategyV1',
'UspPlatypusStrategy',
'Vault',
@@ -31,6 +31,13 @@ existing_contracts AS (
{% endif -%}

new_transfers AS (
SELECT
s.contract_address
, s.block_time
, s.block_number
, s.user_address
, s.net_transfer_amount
FROM (
{%- for strategy in yield_yak_strategies(blockchain) %}
SELECT
s.contract_address
@@ -40,19 +47,8 @@
, SUM(u.net_transfer_amount) AS net_transfer_amount
FROM {{ source(namespace_blockchain, strategy + '_evt_Transfer') }} s
CROSS JOIN UNNEST(ARRAY[s."from", s.to], ARRAY[-1 * CAST(s.value AS INT256), CAST(s.value AS INT256)]) AS u(user_address, net_transfer_amount)
{%- if is_incremental() %}
LEFT JOIN existing_contracts c
ON c.contract_address = s.contract_address
WHERE
(({{ incremental_predicate('s.evt_block_time') }}
AND s.evt_block_time > c.max_from_time)
OR c.contract_address IS NULL) -- This line allows for new contract_addresses being appended that were not already included in previous runs but also allows their entire historical data to be loaded
AND s."from" != s."to"
{%- endif %}
{%- if not is_incremental() %}
WHERE
s."from" != s."to"
{%- endif %}
GROUP BY
s.contract_address
, s.evt_block_time
@@ -63,6 +59,15 @@
UNION ALL
{%- endif -%}
{%- endfor %}
) s
{%- if is_incremental() %}
LEFT JOIN existing_contracts c
ON c.contract_address = s.contract_address
WHERE
({{ incremental_predicate('s.block_time') }}
AND s.block_time > c.max_from_time)
OR c.contract_address IS NULL -- This line allows for new contract_addresses being appended that were not already included in previous runs but also allows their entire historical data to be loaded
{%- endif %}
),

combined_table AS (
@@ -41,7 +41,7 @@ Because of that, the `alm.trades` design mimics the one used for `dex.trades`.

### Data Flow Architecture

As previously said, the architecture of `alm.trades` mimics the one of `dex.trades`. Because of that, [this diagram](https://github.com/duneanalytics/spellbook/blob/main/models/_sector/dex/readme.md#data-flow-architecture) can be taken as a reference.
As previously said, the architecture of `alm.trades` mimics the one of `dex.trades`. Because of that, [this diagram](https://github.com/duneanalytics/spellbook/blob/main/dbt_subprojects/dex/models/trades/readme.md) can be taken as a reference.
In `alm.trades`, the 2 macros used are:
- `arrakis_compatible_v2_trades`: which tracks all the Uniswap V3 LP positions (timestamp, liquidity, and tick information) minted by Arrakis Finance vaults, and then derives the volume served for each swap based on the price movement of the pool.
- `add_pool_price_usd`: which uses `prices.usd` to populate the pool price in USD, so the volume served can be expressed in USD terms.
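
Purely as an illustrative sketch (not part of this commit), the two macros above could be composed in an `alm.trades`-style dbt model along these lines; the argument names and values are assumptions for illustration, not the macros' actual signatures:

```sql
{{ config(alias = 'arrakis_trades') }}

-- Hypothetical composition of the two macros described in the readme above;
-- argument names and values are illustrative assumptions only.
with lp_trades as (
    -- per-swap volume served, derived from Arrakis vault LP positions
    {{ arrakis_compatible_v2_trades(blockchain = 'ethereum') }}
)

-- enrich with prices.usd so the served volume can be expressed in USD terms
{{ add_pool_price_usd(trades_cte = 'lp_trades') }}
```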
@@ -0,0 +1,222 @@
{{config(
alias = 'beets_pools_sonic',
post_hook = '{{ expose_spells(\'["sonic"]\',
"sector",
"labels",
\'["viniabussafi"]\') }}'
)}}

WITH v2_pools AS(
WITH pools AS (
SELECT
pool_id,
zip.tokens AS token_address,
zip.weights / pow(10, 18) AS normalized_weight,
symbol,
pool_type,
pool_name
FROM (
SELECT
c.poolId AS pool_id,
t.tokens,
w.weights,
cc.symbol,
'weighted' AS pool_type,
cc.name AS pool_name
FROM {{ source('beethoven_x_v2_sonic', 'Vault_evt_PoolRegistered') }} c
INNER JOIN {{ source('beethoven_x_v2_sonic', 'WeightedPoolFactory_call_create') }} cc
ON c.evt_tx_hash = cc.call_tx_hash
AND bytearray_substring(c.poolId, 1, 20) = cc.output_0
CROSS JOIN UNNEST(cc.tokens) WITH ORDINALITY t(tokens, pos)
CROSS JOIN UNNEST(cc.normalizedWeights) WITH ORDINALITY w(weights, pos)
WHERE t.pos = w.pos
) zip


UNION ALL

SELECT
c.poolId AS pool_id,
t.tokens AS token_address,
0 AS normalized_weight,
cc.symbol,
'stable' AS pool_type,
cc.name AS pool_name
FROM {{ source('beethoven_x_v2_sonic', 'Vault_evt_PoolRegistered') }} c
INNER JOIN {{ source('beethoven_x_v2_sonic', 'ComposableStablePoolFactory_call_create') }} cc
ON c.evt_tx_hash = cc.call_tx_hash
AND bytearray_substring(c.poolId, 1, 20) = cc.output_0
CROSS JOIN UNNEST(cc.tokens) AS t(tokens)
),

settings AS (
SELECT
pool_id,
coalesce(t.symbol, '?') AS token_symbol,
normalized_weight,
p.symbol AS pool_symbol,
p.pool_type,
p.pool_name
FROM pools p
LEFT JOIN {{ source('tokens', 'erc20') }} t ON p.token_address = t.contract_address
AND t.blockchain = 'sonic'
)

SELECT
'sonic' AS blockchain,
bytearray_substring(pool_id, 1, 20) AS address,
CASE WHEN pool_type IN ('stable')
THEN lower(pool_symbol)
ELSE lower(concat(array_join(array_agg(token_symbol ORDER BY token_symbol), '/'), ' ',
array_join(array_agg(cast(norm_weight AS varchar) ORDER BY token_symbol), '/')))
END AS name,
pool_name AS poolname,
pool_type,
'2' AS version,
'beets_v2_pool' AS category,
'beets' AS contributor,
'query' AS source,
TIMESTAMP'2024-12-15 00:00' AS created_at,
now() AS updated_at,
'beets_pools_sonic' AS model_name,
'identifier' AS label_type
FROM (
SELECT
s1.pool_id,
token_symbol,
pool_symbol,
cast(100 * normalized_weight AS integer) AS norm_weight,
pool_type,
pool_name
FROM settings s1
GROUP BY s1.pool_id, token_symbol, pool_symbol, normalized_weight, pool_type, pool_name
) s
GROUP BY pool_id, pool_symbol, pool_type, pool_name
ORDER BY 1),

v3_pools AS(
WITH token_data AS (
SELECT
pool,
ARRAY_AGG(FROM_HEX(json_extract_scalar(token, '$.token')) ORDER BY token_index) AS tokens
FROM (
SELECT
pool,
tokenConfig,
SEQUENCE(1, CARDINALITY(tokenConfig)) AS token_index_array
FROM {{ source('beethoven_x_v3_sonic', 'Vault_evt_PoolRegistered') }}
) AS pool_data
CROSS JOIN UNNEST(tokenConfig, token_index_array) AS t(token, token_index)
GROUP BY 1
),

pools AS (
SELECT
pool_id,
zip.tokens AS token_address,
zip.weights / POWER(10, 18) AS normalized_weight,
symbol,
pool_type
FROM (
SELECT
c.pool AS pool_id,
t.tokens,
w.weights,
cc.symbol,
'weighted' AS pool_type
FROM token_data c
INNER JOIN {{ source('beethoven_x_v3_sonic', 'WeightedPoolFactory_call_create') }} cc
ON c.pool = cc.output_pool
CROSS JOIN UNNEST(c.tokens) WITH ORDINALITY t(tokens, pos)
CROSS JOIN UNNEST(cc.normalizedWeights) WITH ORDINALITY w(weights, pos)
WHERE t.pos = w.pos

UNION ALL

SELECT
c.pool AS pool_id,
t.tokens,
0 AS weights,
cc.symbol,
'stable' AS pool_type
FROM token_data c
INNER JOIN {{ source('beethoven_x_v3_sonic', 'StablePoolFactory_call_create') }} cc
ON c.pool = cc.output_pool
CROSS JOIN UNNEST(c.tokens) AS t(tokens)
) zip
),

settings AS (
SELECT
pool_id,
coalesce(t.symbol, '?') AS token_symbol,
normalized_weight,
p.symbol AS pool_symbol,
p.pool_type
FROM pools p
LEFT JOIN {{ source('tokens', 'erc20') }} t ON p.token_address = t.contract_address
AND t.blockchain = 'sonic'
)

SELECT
'sonic' AS blockchain,
bytearray_substring(pool_id, 1, 20) AS address,
CASE WHEN pool_type IN ('stable')
THEN lower(pool_symbol)
ELSE lower(concat(array_join(array_agg(token_symbol ORDER BY token_symbol), '/'), ' ',
array_join(array_agg(cast(norm_weight AS varchar) ORDER BY token_symbol), '/')))
END AS name,
pool_type,
'3' AS version,
'beets_v3_pool' AS category,
'beets' AS contributor,
'query' AS source,
TIMESTAMP'2024-12-15 00:00' AS created_at,
now() AS updated_at,
'beets_pools_sonic' AS model_name,
'identifier' AS label_type
FROM (
SELECT
s1.pool_id,
token_symbol,
pool_symbol,
cast(100 * normalized_weight AS integer) AS norm_weight,
pool_type
FROM settings s1
GROUP BY s1.pool_id, token_symbol, pool_symbol, normalized_weight, pool_type
) s
GROUP BY pool_id, pool_symbol, pool_type
ORDER BY 1
)

SELECT
blockchain,
address,
name,
pool_type,
version,
category,
contributor,
source,
created_at,
updated_at,
model_name,
label_type
FROM v2_pools

UNION

SELECT
blockchain,
address,
name,
pool_type,
version,
category,
contributor,
source,
created_at,
updated_at,
model_name,
label_type
FROM v3_pools
@@ -0,0 +1,54 @@
version: 2

models:
- name: labels_beets_pools_sonic
meta:
blockchain: sonic
sector: labels
project: beets
contributors: viniabussafi
config:
tags: ['labels', 'fantom', 'balancer', 'pools']
description: 'Beets liquidity pools created on Sonic.'
data_tests:
- dbt_utils.unique_combination_of_columns:
combination_of_columns:
- address
- name
- category
- model_name
- blockchain
columns:
- &blockchain
name: blockchain
description: 'Blockchain'
- &address
name: address
description: 'Address of liquidity pool'
- &name
name: name
description: 'Label name of pool containg the token symbols and their respective weights (if applicable)'
- &poolname
name: poolname
description: 'Label name of pool set at contract creation'
- &category
name: category
description: 'Label category'
- &contributor
name: contributor
description: 'Wizard(s) contributing to labels'
- &source
name: source
description: 'How were labels generated (could be static or query)'
- &created_at
name: created_at
description: 'When were labels created'
- &updated_at
name: updated_at
description: "When were labels updated for the last time"
- &model_name
name: model_name
description: "Name of the label model sourced from"
- &label_type
name: label_type
description: "Type of label (see labels overall readme)"
