Introduce Spark contract
storojs72 committed Jul 6, 2023
1 parent 68261f6 commit d8b6752
Showing 16 changed files with 487 additions and 0 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/test.yml
@@ -39,6 +39,8 @@ jobs:
python src/verifier/step1/step1-data-contract-gen.py compressed-snark.json > src/verifier/step1/Step1Data.sol
python src/verifier/step2/step2-data-contract-gen.py verifier-key.json compressed-snark.json > src/verifier/step2/Step2Data.sol
python src/verifier/step3/step3-data-contract-gen.py verifier-key.json compressed-snark.json > src/verifier/step3/Step3Data.sol
python src/verifier/step4/spark/spark-multi-evaluation-data-contract-gen.py verifier-key.json primary > src/verifier/step4/spark/SparkMultiEvaluationDataPrimary.sol
python src/verifier/step4/spark/spark-multi-evaluation-data-contract-gen.py verifier-key.json secondary > src/verifier/step4/spark/SparkMultiEvaluationDataSecondary.sol
- name: Run forge fmt on re-generated contracts
run: |
51 changes: 51 additions & 0 deletions README.md
@@ -55,3 +55,54 @@ python src/verifier/step1/step1-data-contract-gen.py compressed-snark.json > src
python src/verifier/step2/step2-data-contract-gen.py verifier-key.json compressed-snark.json > src/verifier/step2/Step2Data.sol
python src/verifier/step3/step3-data-contract-gen.py verifier-key.json compressed-snark.json > src/verifier/step3/Step3Data.sol
```

# Integration tests and unit tests

Unit tests are located in `test/unit-tests`. These tests were created while debugging and deriving the initial Solidity implementation of particular building blocks and steps of the Nova verifier. Currently, they use hardcoded values (over the Pasta curves), including values of intermediate variables obtained while running the reference Nova verifier in Rust. Our goal is to eventually have tests (test vectors) that take their input only from the committed JSONs, without any reliance on intermediate variables. Another direction for improving the unit-testing system is synthesizing tests for a verifier parameterized by a particular curve cycle (Pasta, Grumpkin, etc.); the [switch-json](https://github.com/lurk-lab/solidity-verifier/tree/switch-json) branch provides more concrete details on this.
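
As an illustration, here is a minimal sketch of such a unit test against the Spark contract introduced in this commit (the file, contract and test names are hypothetical, the values are placeholders rather than real Pasta test vectors, and it assumes the same `@std` remapping used by `script/Spark.s.sol`):
```
// SPDX-License-Identifier: Apache-2.0
pragma solidity ^0.8.0;

import "@std/Test.sol";
import "src/verifier/step4/spark/SparkMultiEvaluationContract.sol";

// Hypothetical unit test: pushes hardcoded placeholder scalars into the Spark
// contract and checks that they end up in storage as expected.
contract SparkMultiEvaluationUnitTest is Test {
    SparkMultiEvaluationContract spark;

    function setUp() public {
        spark = new SparkMultiEvaluationContract();
    }

    function testPushRxPrimary() public {
        uint256[] memory r_x = new uint256[](2);
        r_x[0] = 0x01; // placeholder values; real tests hardcode Pasta field elements
        r_x[1] = 0x02;

        spark.pushToRxPrimary(r_x);

        assertEq(spark.r_x_primary(0), 0x01);
        assertEq(spark.r_x_primary(1), 0x02);
    }
}
```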

Integration tests are located in `test/integration-tests`. These tests rely on infrastructure: a running Ethereum node and an RPC client that uploads proving data from the JSONs into the global state. We use `anvil` as the Ethereum node and `cast` as the RPC client, both from the [Foundry](https://github.com/foundry-rs/foundry/tree/master) development framework. Since Nova's public parameters are quite large, the entire verifier key cannot be kept in the VM's internal memory alone, so for integration testing purposes we load the public parameters into the blockchain before executing a particular verification. For example, for Spark multi-evaluation, verification can be performed with the following procedure:

1) (Terminal A) Run an `anvil` node with the maximum gas limit and contract code size limit (the global state will be dumped into the `state.json` file):
```
anvil --order fifo --gas-limit 18446744073709551615 --code-size-limit 18446744073709551615 --state state.json
```

2) (Terminal B) Deploy the Spark contract (located in `src/verifier/step4/spark/SparkMultiEvaluationContract.sol`) to the running `anvil` instance:
```
forge script script/Spark.s.sol:SparkVerificationDeployer --fork-url http://127.0.0.1:8545 --private-key 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80 --broadcast
```

3) (Terminal B) Load the data from the precommitted `verifier-key.json` into the blockchain:
```
python test/integration-tests/integration-tests.py verifier-key.json
```

Note that the private key and the deployed contract address are hardcoded inside `integration-tests.py`.

4) (Terminal A) Stop the `anvil` node and alter the state using `test/integration-tests/state-modifier.py` (the last argument is the address of the deployed contract):
```
Ctrl + C
python test/integration-tests/state-modifier.py state.json 0xe7f1725e7734ce288f8367e1bb143e90bb3f0512
```

This step is currently required because of a [confirmed](https://github.com/foundry-rs/foundry/issues/5302) bug in the `cast` parser: some values from the JSONs are not loaded into the blockchain correctly, which may cause verification failures. `state-modifier.py` simply replaces the affected values in `state.json` (detected manually by careful analysis) with the expected ones. With other JSONs, more such values will probably be detected.

5) (Terminal A) Run the `anvil` node once again, using the altered state:
```
anvil --order fifo --gas-limit 18446744073709551615 --code-size-limit 18446744073709551615 --state state.json
```

6) Finally, execute the primary and secondary verifications:

```
cast call 0xe7f1725e7734ce288f8367e1bb143e90bb3f0512 "verifyPrimary(uint256,uint256,uint256) (bool)" "0x07122b66b54727bf8bebec13052121d753589eb15040a49cb2ee5884810dc0a4" "0x339a352816f770e1bb7437e5cdd54bee76ed9ff13d1d7e9246f33e1a9dbc2656" "0x1ee416e56d10079af3a1785954078120077c6e428269fa00527b0e6a61d3d320" --private-key 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80
cast call 0xe7f1725e7734ce288f8367e1bb143e90bb3f0512 "verifySecondary(uint256,uint256,uint256) (bool)" "0x267c7eb46d40984b873837e3eb10319b67245557e8f49efceaed4836f1cc05ee" "0x08250c7a9ba4b363fde20f4f77a5d5634401c952e71556af1d17682322153b43" "0x0a01229ad7bbad1e74f05f55c482b1e9ecf2a8b81ed95a4cecc9d78c8f925224" --private-key 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80
```

You should see `true` in both outputs.
13 changes: 13 additions & 0 deletions script/Spark.s.sol
@@ -0,0 +1,13 @@
// SPDX-License-Identifier: Apache-2.0
pragma solidity ^0.8.0;

import "@std/Script.sol";
import "src/verifier/step4/spark/SparkMultiEvaluationContract.sol";

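// Deployment script for the Spark multi-evaluation contract; run it via `forge script`
// as shown in the README integration-testing instructions.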
contract SparkVerificationDeployer is Script {
    function run() external {
        vm.startBroadcast();
        new SparkMultiEvaluationContract();
        vm.stopBroadcast();
    }
}
163 changes: 163 additions & 0 deletions src/verifier/step4/spark/SparkMultiEvaluationContract.sol
@@ -0,0 +1,163 @@
// SPDX-License-Identifier: Apache-2.0
pragma solidity ^0.8.0;

import "src/pasta/Pallas.sol";
import "src/pasta/Vesta.sol";
import "src/verifier/step4/EqPolynomial.sol";

contract SparkMultiEvaluationContract {
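    // A single non-zero entry of a sparse matrix: row index `i`, column index `j`
    // and the corresponding scalar value.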
    struct MatrixData {
        uint32 i;
        uint32 j;
        uint256 scalar;
    }

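    // Evaluation points (r_x, r_y) of the primary and secondary instances.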
    uint256[] public r_x_primary;
    uint256[] public r_y_primary;
    uint256[] public r_x_secondary;
    uint256[] public r_y_secondary;

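    // Sparse A, B, C matrices of the primary and secondary instances, stored entry by entry.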
    MatrixData[] public A_primary;
    MatrixData[] public B_primary;
    MatrixData[] public C_primary;
    MatrixData[] public A_secondary;
    MatrixData[] public B_secondary;
    MatrixData[] public C_secondary;

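    // The push* functions below append caller-supplied batches to the corresponding storage
    // arrays, which allows the large public parameters to be uploaded in several transactions.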
    function pushToRxPrimary(uint256[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            r_x_primary.push(input[index]);
        }
    }

    function pushToRyPrimary(uint256[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            r_y_primary.push(input[index]);
        }
    }

    function pushToAprimary(MatrixData[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            A_primary.push(input[index]);
        }
    }

    function pushToBprimary(MatrixData[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            B_primary.push(input[index]);
        }
    }

    function pushToCprimary(MatrixData[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            C_primary.push(input[index]);
        }
    }

    function pushToRxSecondary(uint256[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            r_x_secondary.push(input[index]);
        }
    }

    function pushToRySecondary(uint256[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            r_y_secondary.push(input[index]);
        }
    }

    function pushToAsecondary(MatrixData[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            A_secondary.push(input[index]);
        }
    }

    function pushToBsecondary(MatrixData[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            B_secondary.push(input[index]);
        }
    }

    function pushToCsecondary(MatrixData[] calldata input) public {
        for (uint256 index = 0; index < input.length; index++) {
            C_secondary.push(input[index]);
        }
    }

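    // Re-evaluates the stored sparse matrices at the stored evaluation points and compares
    // the results with the expected (claimed) evaluations.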
    function verifyPrimary(uint256 expectedA, uint256 expectedB, uint256 expectedC) public view returns (bool) {
        (uint256 evalsA, uint256 evalsB, uint256 evalsC) = multiEvaluatePrimary();
        return verifyInner(evalsA, evalsB, evalsC, expectedA, expectedB, expectedC);
    }

    function verifySecondary(uint256 expectedA, uint256 expectedB, uint256 expectedC) public view returns (bool) {
        (uint256 evalsA, uint256 evalsB, uint256 evalsC) = multiEvaluateSecondary();
        return verifyInner(evalsA, evalsB, evalsC, expectedA, expectedB, expectedC);
    }

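    // Evaluates the secondary A, B, C matrices at (r_x_secondary, r_y_secondary),
    // using eq-polynomial evaluation tables computed modulo Pallas.P_MOD.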
    function multiEvaluateSecondary() private view returns (uint256, uint256, uint256) {
        uint256[] memory T_x = EqPolinomialLib.evalsPallas(r_x_secondary);
        uint256[] memory T_y = EqPolinomialLib.evalsPallas(r_y_secondary);

        uint256 result_C_secondary = multiEvaluateInner(C_secondary, T_x, T_y, Pallas.P_MOD);
        uint256 result_B_secondary = multiEvaluateInner(B_secondary, T_x, T_y, Pallas.P_MOD);
        uint256 result_A_secondary = multiEvaluateInner(A_secondary, T_x, T_y, Pallas.P_MOD);

        return (result_A_secondary, result_B_secondary, result_C_secondary);
    }

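    // Evaluates the primary A, B, C matrices at (r_x_primary, r_y_primary),
    // using eq-polynomial evaluation tables computed modulo Vesta.P_MOD.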
    function multiEvaluatePrimary() private view returns (uint256, uint256, uint256) {
        uint256[] memory T_x = EqPolinomialLib.evalsVesta(r_x_primary);
        uint256[] memory T_y = EqPolinomialLib.evalsVesta(r_y_primary);

        uint256 result_C_primary = multiEvaluateInner(C_primary, T_x, T_y, Vesta.P_MOD);
        uint256 result_B_primary = multiEvaluateInner(B_primary, T_x, T_y, Vesta.P_MOD);
        uint256 result_A_primary = multiEvaluateInner(A_primary, T_x, T_y, Vesta.P_MOD);

        return (result_A_primary, result_B_primary, result_C_primary);
    }

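    // Returns true only if every computed evaluation matches its expected counterpart.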
    function verifyInner(
        uint256 actualA,
        uint256 actualB,
        uint256 actualC,
        uint256 expectedA,
        uint256 expectedB,
        uint256 expectedC
    ) private pure returns (bool) {
        if (actualA != expectedA) {
            return false;
        }
        if (actualB != expectedB) {
            return false;
        }
        if (actualC != expectedC) {
            return false;
        }
        return true;
    }

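    // Accumulates scalar * T_x[i] * T_y[j] (mod modulus) over all sparse-matrix entries.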
    function multiEvaluateInner(MatrixData[] memory input, uint256[] memory T_x, uint256[] memory T_y, uint256 modulus)
        private
        pure
        returns (uint256)
    {
        uint256 result = 0;
        uint256 val = 0;
        uint256 T_row = 0;
        uint256 T_col = 0;

        for (uint256 i = 0; i < input.length; i++) {
            T_row = T_x[input[i].i];
            T_col = T_y[input[i].j];
            val = input[i].scalar;

            assembly {
                val := mulmod(T_row, val, modulus)
                val := mulmod(T_col, val, modulus)
                result := addmod(result, val, modulus)
            }
        }

        return result;
    }
}