
[DataCap Application] <AWS> - <International Neuroimaging Data-Sharing Initiative> #14

Open
1 of 2 tasks
penko101 opened this issue Aug 19, 2024 · 48 comments

@penko101

Data Owner Name

Child Mind Institute

Data Owner Country/Region

Afghanistan

Data Owner Industry

Life Science / Healthcare

Website

https://childmind.org/science/

Social Media Handle

https://x.com/ChildMindInst

Social Media Type

Slack

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

5PiB

Expected size of single dataset (one copy)

410TiB

Number of replicas to store

10

Weekly allocation of DataCap requested

512TiB
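The request size can be sanity-checked from the figures above. A quick back-of-the-envelope calculation (using the 1 PiB = 1024 TiB convention):

```python
# Sanity check of the requested DataCap figures from this application.
PIB = 1024  # 1 PiB = 1024 TiB

copy_size_tib = 410   # expected size of a single copy
replicas = 10         # number of replicas to store
requested_pib = 5     # total DataCap requested
weekly_tib = 512      # weekly allocation requested

total_tib = copy_size_tib * replicas                  # 4100 TiB
total_pib = round(total_tib / PIB, 2)                 # ~4.0 PiB, under the 5 PiB requested
weeks_to_exhaust = requested_pib * PIB / weekly_tib   # ~10 weeks at the requested rate

print(total_tib, total_pib, weeks_to_exhaust)  # → 4100 4.0 10.0
```

So 10 replicas of one 410 TiB copy come to roughly 4 PiB, within the 5 PiB requested, and the full grant would be consumed in about 10 weeks at 512 TiB/week.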

On-chain address for first allocation

f1l2aidxe4ogjrvc5rhs7dnltwnpuqle6rbcai7by

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

  • Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

We’re dedicated to transforming the lives of children and families struggling with mental health and learning disorders by giving them the help they need. We’ve become the leading independent nonprofit in children’s mental health by providing gold-standard evidence-based care, delivering educational resources to millions of families each year, training educators in underserved communities, and developing tomorrow’s breakthrough treatments.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

This bucket contains multiple neuroimaging datasets that are part of the International Neuroimaging Data-Sharing Initiative. Raw human and non-human primate neuroimaging data include 1) Structural MRI; 2) Functional MRI; 3) Diffusion Tensor Imaging; 4) Electroencephalogram (EEG). In addition to the raw data, preprocessed data is also included for some datasets.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)?

China

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

We would use the Filecoin tool Lotus to convert the files into CAR files. We would also ship these files to SPs offline; we have enough hard disks to do this.
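The preparation step described above can be sketched with the Lotus CLI. This is a minimal illustration, assuming a synced Lotus node; the paths and dataset directory names are hypothetical, and the exact subcommands can vary by Lotus version:

```shell
# Pack a dataset directory into a CAR file (input and output paths are illustrative):
lotus client generate-car /data/fcp-indi/site-01 /cars/site-01.car

# Import the CAR to obtain its root CID for deal-making:
lotus client import --car /cars/site-01.car

# Compute the piece CID (CommP) needed for the deal proposal:
lotus client commP /cars/site-01.car
```

For offline deals, the resulting CAR files would then be copied to hard disks and shipped to the SPs, who import them on their side.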

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

aws s3 ls --no-sign-request s3://fcp-indi/
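Since the bucket is public, anyone can browse or fetch it anonymously with the AWS CLI. A short sketch (the prefix below is illustrative, not a confirmed path in the bucket):

```shell
# List the public bucket without AWS credentials:
aws s3 ls --no-sign-request s3://fcp-indi/

# Download a subtree for inspection (prefix is illustrative):
aws s3 cp --no-sign-request --recursive s3://fcp-indi/data/ ./fcp-indi-sample/
```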

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Sporadic

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, South America, Europe, Australia (continent)

How will you be distributing your data to storage providers

HTTP or FTP server, Shipping hard drives, Lotus built-in data transfer

How did you find your storage providers

Slack, Big Data Exchange, Partners

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

1. f01955030 Zhejiang
2. f02128256 Canada
3. f01844118 USA
4. f02211576 Guangdong
5. f02114994 Sichuan

How do you plan to make deals to your storage providers

Lotus client
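Deal-making with the Lotus client typically takes the data's root CID, an SP ID, a price, and a duration. A hedged sketch (the CID is a placeholder, the price of 0 reflects verified-deal conventions, and 1051200 epochs is roughly one year at 30 s per epoch):

```shell
# lotus client deal <dataCid> <minerId> <price-per-epoch> <duration-in-epochs>
lotus client deal bafy...<root-cid> f01955030 0 1051200
```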

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

Contributor

datacap-bot bot commented Aug 19, 2024

Application is waiting for allocator review

@1475Notary
Owner

Hello @penko101 ,

I want to ask you some questions.

1. Is this your first time applying? Have you ever applied for DataCap from another allocator?
2. Do the SPs you work with support retrieval?
3. Do you have enough tokens for your plan?
4. For bookkeeping, I'll keep a record of our communications. Do you accept that?

Also, please send your email to the Slack user 1475Notary; we need to verify your identity.

@penko101
Author

@1475Notary
1. Yes, this is our first application, but we have Fil+ experience and have participated in our partner's project.
2. Yes, we will require the SPs to support retrieval.
3. I think the SPs have enough tokens.
4. Sure!

[screenshot attached]
I sent the message on your Slack. If you need anything more, please leave me a message!

@1475Notary
Owner

@penko101 OK, I saw it. If you need to change SPs during the process, please post the latest SP list in the application.

@penko101
Author

@1475Notary Thank you, notary. I will keep my application updated. Please approve my application.

@1475Notary
Owner

@penko101 I'll give you a chance and support you this round.

Contributor

datacap-bot bot commented Aug 19, 2024

Datacap Request Trigger

Total DataCap requested

5PiB

Expected weekly DataCap usage rate

512TiB

DataCap Amount - First Tranche

512TiB

Client address

f1l2aidxe4ogjrvc5rhs7dnltwnpuqle6rbcai7by

Contributor

datacap-bot bot commented Aug 19, 2024

DataCap Allocation requested

Multisig Notary address

Client address

f1l2aidxe4ogjrvc5rhs7dnltwnpuqle6rbcai7by

DataCap allocation requested

512TiB

Id

8fcea7e9-f909-4e2a-9f42-ca2f3971905d

Contributor

datacap-bot bot commented Aug 19, 2024

Application is ready to sign

Contributor

datacap-bot bot commented Aug 19, 2024

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedxfdyobddwhik2uwe2dqikn4jmfqu6yae7e65tnr4eyvm2wpszv2

Address

f1l2aidxe4ogjrvc5rhs7dnltwnpuqle6rbcai7by

Datacap Allocated

512TiB

Signer Address

f17a4y7bgfvzl7m4dw3kbxnlfutpxfelfim6risnq

Id

8fcea7e9-f909-4e2a-9f42-ca2f3971905d

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedxfdyobddwhik2uwe2dqikn4jmfqu6yae7e65tnr4eyvm2wpszv2

Contributor

datacap-bot bot commented Aug 19, 2024

Application is Granted

@penko101
Author

@1475Notary Thank you notary.

@penko101
Author

@1475Notary Dear notary, we have added 2 SPs for our storage:
f03157879 Los Angeles
f03159626 Dulles

Contributor

datacap-bot bot commented Sep 23, 2024

Client used 75% of the allocated DataCap. Consider allocating next tranche.

@1475Notary
Owner

checker:manualTrigger

Contributor

datacap-bot bot commented Sep 29, 2024

DataCap and CID Checker Report Summary [1]

Storage Provider Distribution

✔️ Storage provider distribution looks healthy.

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients [2]

✔️ No CID sharing has been observed.

Full report

Click here to view the CID Checker report.

Footnotes

  1. To manually trigger this report, add a comment with text checker:manualTrigger

  2. To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

Contributor

datacap-bot bot commented Sep 29, 2024

Application is in Refill

Contributor

datacap-bot bot commented Dec 11, 2024

DataCap and CID Checker Report Summary [1]

Storage Provider Distribution

✔️ Storage provider distribution looks healthy.

⚠️ 26.32% of Storage Providers have retrieval success rate equal to zero.

⚠️ 73.68% of Storage Providers have retrieval success rate less than 75%.

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients [2]

✔️ No CID sharing has been observed.

Full report

Click here to view the CID Checker report.

Footnotes

  1. To manually trigger this report, add a comment with text checker:manualTrigger

  2. To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

@1475Notary
Owner

@penko101 Can you explain SPs f03100008 and f03100009? Their retrieval success rate is 0, and you previously said:

> We know the retrieval rate has decreased, so we have replaced the SPs that no longer support retrieval. I think the next report will show much better results.

But you still sent data to them several days ago.

@penko101
Copy link
Author

@1475Notary Dear allocator, this is because those SPs tried to repair their retrieval service several days ago. As the SPs explained, under Spark's rules an SP must have recently sealed data for its latest retrieval rate to appear on Spark, so we resent some data to them. After finding that retrieval failed again, we stopped sending data to those SPs.

Repository owner deleted a comment from datacap-bot bot Dec 12, 2024 (×7)
@penko101
Author

We have added one new partner: f03241837 (Shanghai).
[screenshot attached]


Contributor

datacap-bot bot commented Dec 16, 2024

Client used 75% of the allocated DataCap. Consider allocating next tranche.

@1475Notary
Owner

@penko101 OK. Please keep the SP list updated in a timely manner.
