
[DataCap Application] Column Sonar Data Archive on AWS #57

Open
1 of 2 tasks
lixue-one opened this issue Dec 17, 2024 · 66 comments
@lixue-one

lixue-one commented Dec 17, 2024

Version

1

DataCap Applicant

Column Sonar Data Archive on AWS

Project ID

CSDA-002

Data Owner Name

Column Sonar Data Archive on AWS

Data Owner Country/Region

United States

Data Owner Industry

Life Science / Healthcare

Website

https://www.ncei.noaa.gov/maps/water-column-sonar/; https://cires.gitbook.io/ncei-wcsd-archive

Social Media Handle

https://www.ncei.noaa.gov/maps/water-column-sonar/; https://cires.gitbook.io/ncei-wcsd-archive

Social Media Type

Slack

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

10PiB

Expected size of single dataset (one copy)

2PiB

Number of replicas to store

5

Weekly allocation of DataCap requested

512TiB
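
The sizes requested above are internally consistent, and a quick script makes the arithmetic explicit. This is only a sanity-check sketch of the numbers in the form; the actual tranche schedule is decided by the allocator, not computed here.

```python
# Sanity-check the sizes stated in the application form.
PIB_IN_TIB = 1024                   # 1 PiB = 1024 TiB

dataset_tib = 2 * PIB_IN_TIB        # one copy of the dataset: 2 PiB
replicas = 5                        # replicas to store
total_tib = dataset_tib * replicas  # 2 PiB x 5 = 10 PiB, matching the request
weekly_tib = 512                    # requested weekly allocation

# At the full weekly rate, onboarding the whole request takes 20 weeks.
weeks_at_full_rate = total_tib / weekly_tib
print(total_tib // PIB_IN_TIB, "PiB total,", weeks_at_full_rate, "weeks")
```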

On-chain address for first allocation

f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

  • Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

NOAA collects and uses active acoustic (or sonar) data for a variety of mapping requirements. Water column sonar data focus on the area from near the surface of the ocean to the seafloor. Primary uses of these specific sonar data include 3-D mapping of fish schools and other mid-water marine organisms; assessing biological abundance; species identification; and habitat characterization. Other uses include mapping underwater gas seeps and remotely monitoring undersea oil spills. NCEI archives water column sonar data collected by NOAA line offices, academia, industry, and international institutions. Use the CruisePack data packaging tool to submit water column sonar data to the archive.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

Water column sonar data focus on the area from near the surface of the ocean to the seafloor. Primary uses of these specific sonar data include 3-D mapping of fish schools and other mid-water marine organisms; assessing biological abundance; species identification; and habitat characterization. Other uses include mapping underwater gas seeps and remotely monitoring undersea oil spills. NCEI archives water column sonar data collected by NOAA line offices, academia, industry, and international institutions. Use the CruisePack data packaging tool to submit water column sonar data to the archive.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)?

None

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

For data preparation, we have studied and experimented with the Lotus tooling from the Filecoin network.

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

aws s3 ls --no-sign-request s3://noaa-wcsd-pds/

S3 Bucket: "noaa-wcsd-pds"
├── data
│   ├── processed
│   │   ├── SH1305
│   │   │   ├── 18kHz
│   │   │   │   ├── SaKe_2013-D20130522-T134850.csv
│   │   │   │   ├── SaKe_2013-D20130522-T140446_to_SaKe2013-D20130522-T145239.csv
│   │   │   │   ├── ...
│   │   │   ├── 38kHz
│   │   │   │   ├── ...
│   │   │   ├── 70kHz
│   │   │   │   ├── ...
│   │   │   ├── 120kHz
│   │   │   │   ├── ...
│   │   │   ├── 200kHz
│   │   │   │   ├── ...
│   │   │   ├── bottom
│   │   │   │   ├── SaKe_2013-D20130522-T134850.csv
│   │   │   │   ├── SaKe_2013-D20130522-T140446_to_SaKe2013-D20130522-T145239.csv
│   │   │   │   ├── ...
│   │   │   ├── multifrequency
│   │   │   │   ├── SaKe_2013-D20130522-T134850.csv
│   │   │   │   ├── SaKe_2013-D20130522-T140446_to_SaKe2013-D20130522-T145239.csv
│   │   │   │   ├── ...
│   │   │   ├── ...
│   │   ├── GU1002
│   │   │   ├── ...
│   │   ├── AL0502
│   │   │   ├── ...
│   │   ├── ...
│   ├── raw
│   │   ├── Bell_M_Shimada
│   │   │   ├── SH1305
│   │   │   │   ├── EK60
│   │   │   │   │   ├── SaKe_2013-D20130623-T063450.raw
│   │   │   │   │   ├── SaKe_2013-D20130623-T064452.raw
│   │   │   │   │   ├── SaKe_2013-D20130623-T064452.bot
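
Based on the layout shown above, processed data appears to be keyed as `data/processed/<cruise>/<frequency>/` and raw data as `data/raw/<ship>/<cruise>/<instrument>/`. The helpers below are hypothetical (the names are mine, not NOAA's) and simply build listing prefixes under that assumption:

```python
# Hypothetical helpers for building key prefixes in the noaa-wcsd-pds
# bucket, inferred from the directory tree above.

def processed_prefix(cruise: str, frequency: str) -> str:
    """Prefix for processed CSVs, e.g. data/processed/SH1305/18kHz/."""
    return f"data/processed/{cruise}/{frequency}/"

def raw_prefix(ship: str, cruise: str, instrument: str) -> str:
    """Prefix for raw files, e.g. data/raw/Bell_M_Shimada/SH1305/EK60/."""
    return f"data/raw/{ship}/{cruise}/{instrument}/"

print(processed_prefix("SH1305", "18kHz"))  # data/processed/SH1305/18kHz/
```

A prefix like this can be appended to the listing command, e.g. `aws s3 ls --no-sign-request s3://noaa-wcsd-pds/data/processed/SH1305/18kHz/`.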

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, Europe

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), IPFS, Shipping hard drives, Lotus built-in data transfer

How did you find your storage providers

Slack, Partners

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f02865213 United States
f02887063 United States
f02927642 Vietnam
f02953218 Russia
f03275720 United States

How do you plan to make deals to your storage providers

Boost client

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

Contributor

datacap-bot bot commented Dec 17, 2024

Application is waiting for allocator review

@lixue-one
Author

@joshua-ne Any updates here?

@joshua-ne
Owner

Hi @lixue-one, thank you for your application. Most of the application looks good to me, and for the official record, I will post our offline work and my further questions here.

  1. We have completed SP localization verification offline, as well as a sample data check. We have also cross-referenced the owner/worker/controller addresses of the SPs to confirm that not all of the SPs are related to each other.

  2. I can see that the sample data comes from the NOAA AWS bucket. Will the SPs download the data directly from there, or will you preprocess it first and then transfer it to the SPs?

  3. Regarding the question "How do you plan to make deals to your storage providers", you listed the Boost client and the Lotus client. However, I think the Lotus client is already deprecated. Please confirm or remove it.

datacap-bot bot added a commit that referenced this issue Dec 22, 2024
@lixue-one
Author

lixue-one commented Dec 22, 2024

Hey @joshua-ne, we download the data ourselves and ship it to the SPs on hard drives by courier. Each SP has different technical capabilities, bandwidth, and download capacity, so to keep the data backups at a reasonable level we decided to ship hard disks. We have removed the Lotus client.
We look forward to your further support; we are ready. All SPs support Spark, and the retrieval rate should be around 75%.

@joshua-ne
Owner

Hi, everything looks good on my end so far. So let's kick off with a small batch of DC first and see how things go. Happy onboarding!

Contributor

datacap-bot bot commented Dec 23, 2024

Datacap Request Trigger

Total DataCap requested

10PiB

Expected weekly DataCap usage rate

512TiB

DataCap Amount - First Tranche

50TiB

Client address

f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry

Contributor

datacap-bot bot commented Dec 23, 2024

DataCap Allocation requested

Multisig Notary address

Client address

f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry

DataCap allocation requested

50TiB

Id

aaf32523-9c1d-46f2-807b-d0fe8de1ed8b

Contributor

datacap-bot bot commented Dec 23, 2024

Application is ready to sign

Contributor

datacap-bot bot commented Dec 23, 2024

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedquqoul3r4m4qmgaw5cal44sspm7hfcsrcfzpqdun65odkjlzcvm

Address

f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry

Datacap Allocated

50TiB

Signer Address

f1sfffys4o2w64rdpd3alpmvpvj4ik6x2iyjsjmry

Id

aaf32523-9c1d-46f2-807b-d0fe8de1ed8b

You can check the status here https://filfox.info/en/message/bafy2bzacedquqoul3r4m4qmgaw5cal44sspm7hfcsrcfzpqdun65odkjlzcvm

Contributor

datacap-bot bot commented Dec 23, 2024

Application is Granted

@lixue-one
Author

thanks

Contributor

datacap-bot bot commented Jan 1, 2025

Client used 75% of the allocated DataCap. Consider allocating next tranche.

@joshua-ne
Owner

checker:manualTrigger

Contributor

datacap-bot bot commented Jan 2, 2025

DataCap and CID Checker Report Summary [1]

Storage Provider Distribution

✔️ Storage provider distribution looks healthy.

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients [2]

✔️ No CID sharing has been observed.

Full report

Click here to view the CID Checker report.

Footnotes

  1. To manually trigger this report, add a comment with text checker:manualTrigger

  2. To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

@joshua-ne
Owner

trigger:run_retrieval_test method=lassie sp_list=f02984331,f02883857,f02852273,f02973061,f02889193 client=f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry limit=10

@myfil512

myfil512 commented Jan 2, 2025

miner_id   retrieval_rate  retrieval_success_counts  retrieval_fail_counts
f02984331  NA              0                         0
f02883857  NA              0                         0
f02852273  NA              0                         0
f02973061  NA              0                         0
f02889193  NA              0                         0
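
The `NA` rows follow from zero retrieval attempts against those SPs. A minimal sketch of how such a rate column could be derived (my own helper, not the actual test harness used here):

```python
def retrieval_rate(successes: int, failures: int) -> str:
    """Success rate as a percentage string, or "NA" when nothing was attempted."""
    attempts = successes + failures
    if attempts == 0:
        return "NA"  # no deals found to retrieve from this SP
    return f"{100 * successes / attempts:.0f}%"

print(retrieval_rate(0, 0))   # NA
print(retrieval_rate(10, 0))  # 100%
```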

@joshua-ne
Owner

Hi @lixue-one, it seems that NONE of the SPs you have sent deals to match the ones you claimed in the application form. Please explain.

@joshua-ne
Owner

trigger:run_retrieval_test method=lassie sp_list=f02865213,f02887063,f02953218 client=f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry limit=10

@myfil512

myfil512 commented Jan 2, 2025

miner_id   retrieval_rate  retrieval_success_counts  retrieval_fail_counts
f02865213  100%            10                        0
f02887063  100%            10                        0
f02953218  100%            10                        0

Contributor

datacap-bot bot commented Jan 16, 2025

Application is Granted

@datacap-bot datacap-bot bot added granted and removed Refill labels Jan 16, 2025
Contributor

datacap-bot bot commented Jan 18, 2025

Client used 75% of the allocated DataCap. Consider allocating next tranche.

@joshua-ne
Owner

checker:manualTrigger

Contributor

datacap-bot bot commented Feb 10, 2025

DataCap and CID Checker Report Summary [1]

Storage Provider Distribution

⚠️ 2 storage providers sealed more than 25% of total datacap - f02953218: 26.20%, f02927642: 25.31%

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients [2]

✔️ No CID sharing has been observed.

Full report

Click here to view the CID Checker report.

Footnotes

  1. To manually trigger this report, add a comment with text checker:manualTrigger

  2. To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
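
The distribution warning in the report above flags providers holding more than 25% of the sealed DataCap. Below is a hedged sketch of that check; the threshold and helper are my assumptions based on the report text, and the numbers are illustrative only (chosen to reproduce the report's 26.20% / 25.31% shares):

```python
def over_threshold(sealed_tib: dict, threshold: float = 0.25) -> dict:
    """Return {provider: share} for providers above the given share of sealed data."""
    total = sum(sealed_tib.values())
    return {sp: amt / total for sp, amt in sealed_tib.items()
            if amt / total > threshold}

# Illustrative numbers only, not taken from the actual report.
sample = {"f02953218": 26.20, "f02927642": 25.31,
          "f02865213": 24.50, "f02887063": 23.99}
flags = over_threshold(sample)  # flags the two providers above 25%
```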

@joshua-ne
Owner

Hi, I see that you claim to store 5 replicas, but only 4 SPs are provided and onboarded. Are we expecting more SPs anytime soon?

@lixue-one
Author

Yes, we have found additional SPs. If you can support us again, you will definitely see more progress. Thank you.

@lixue-one
Author

lixue-one commented Feb 14, 2025

We want to add f03275720 (US). We are ready; can you support us?

@joshua-ne
Owner

Sure, please update your application form as we proceed. Thanks

Contributor

datacap-bot bot commented Feb 14, 2025

Issue has been modified. Changes below:

(NEW vs OLD)

Please share a sample of the data: unchanged — the `aws s3 ls --no-sign-request s3://noaa-wcsd-pds/` directory tree shown earlier in this application is reproduced identically in both the NEW and OLD versions.

Please list the provider IDs and location of the storage providers you will be working with:

NEW:
f02865213 United States
f02887063 United States
f02927642 Vietnam
f02953218 Russia
f03275720 United States

OLD:
f02865213 United States
f02887063 United States
f02927642 Vietnam
f02953218 Russia

State: ChangesRequested vs Granted

@lixue-one
Author

Done, we have updated it.

Contributor

datacap-bot bot commented Feb 14, 2025

Issue information change request has been approved.

Contributor

datacap-bot bot commented Feb 14, 2025

Application is in Refill

Contributor

datacap-bot bot commented Feb 14, 2025

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacebr2yivdx2blayhuylpeaohnnshgf6ixodrizejeijiun3nqxggxq

Address

f16wp5agwmggn5r23fqf5tq7xxu3hx6dgem7tnjry

Datacap Allocated

1500 TiB

Signer Address

f1sfffys4o2w64rdpd3alpmvpvj4ik6x2iyjsjmry

Id

ddcf97b3-70ab-4c6c-bedc-f02bdf612e75

You can check the status here https://filfox.info/en/message/bafy2bzacebr2yivdx2blayhuylpeaohnnshgf6ixodrizejeijiun3nqxggxq

Contributor

datacap-bot bot commented Feb 14, 2025

Application is Granted

@datacap-bot datacap-bot bot added granted and removed Refill labels Feb 14, 2025