
4th Community Review of 1475 EFil+ Allocator #241

Closed
1475Notary opened this issue Dec 4, 2024 · 6 comments
Labels: DataCap - Throttled · Refresh (Applications received from existing Allocators for a refresh of DataCap allowance)

Comments

@1475Notary

First Review #57
Second Review #143
Third Review #190

Allocator Compliance Report: https://compliance.allocator.tech/report/f03018491/1733274700/report.md

5 PiB of DataCap awarded in the 4th round:

example 1: 1475Notary/1475-Allocator#14 - 1PiB
example 2: 1475Notary/1475-Allocator#19 - 3PiB

@filecoin-watchdog added the labels Refresh (Applications received from existing Allocators for a refresh of DataCap allowance) and Awaiting Community/Watchdog Comment (DataCap Refresh requests awaiting a public verification of the metrics outlined in the Allocator App) on Dec 4, 2024
@Kevin-FF-USA self-assigned this Dec 6, 2024
@filecoin-watchdog (Collaborator)

@1475Notary
Allocator Application
Compliance Report
1st Review
2nd Review
3rd Review
1st Review: 2.5 PiB granted
2nd Review: 5 PiB granted
3rd Review: 5 PiB granted

5 PiB granted to existing clients:

| Client Name | DC | Status |
| --- | --- | --- |
| ChildMind Institute | 1 PiB | Existing |
| National Center for Atmospheric Research | 3 PiB | Existing |

ChildMind Institute

The client stated they stopped cooperating with low-retrieval SPs and replaced them. However, comparing the reports from 2024-10-17 and 2024-12-11, I see only one SP with poor retrieval in the first report (f03159626) that has not sealed any new data since then. In contrast, two SPs with very low retrieval (f03100008, f03100009) from the previous report are still sealing new data, both showing a 0% retrieval rate.
Although the allocator did not include these two SPs in the newest SP list, the latest report contradicts that.
Additionally, I noticed that some SPs with very good retrieval rates (f01084413, f01660795) were excluded from the list. I do not understand the logic behind including or excluding certain SPs from further cooperation. Please provide an explanation.
There are already 19 replicas created.

National Center for Atmospheric Research

The client updated the SP list on 2024-12-02, but this new list does not include all the SPs currently in use, such as:
f03231666
f03241837
f01975299
f02826588
f02851143
f03066836

There are already 20 replicas created, and 30.08% of deals involve data replicated across fewer than 4 storage providers. Please explain why additional replicas are needed. How did the client and/or allocator verify that the data was irretrievable, and how will you ensure that this data can now be retrieved?

@filecoin-watchdog filecoin-watchdog added Awaiting Response from Allocator If there is a question that was raised in the issue that requires comment before moving forward. and removed Awaiting Community/Watchdog Comment DataCap Refresh requests awaiting a public verification of the metrics outlined in Allocator App. labels Dec 11, 2024
@1475Notary (Author)

Hello @filecoin-watchdog ,

1475Notary/1475-Allocator#14

I think there is some misunderstanding here. When clients leave comments about updating SPs, they list the SPs they plan to add, rather than all the SPs they are currently working with. SPs that were already disclosed are not disclosed again in the list of added SPs.

They have given their explanation for those two SPs:

two SPs with very low retrieval (f03100008, f03100009) from the previous report are still sealing new data, both showing a 0% retrieval rate.


1475Notary/1475-Allocator#19


This is the explanation from this client; it addresses the additional replicas noted in the report. The client has replaced some SPs and added new ones, so the total number of SPs in their list is now more than 8.

We often use the Spark dashboard as the first step to check retrieval, and we also use the Boost retrieval program. Using both methods leads to more reliable results. If the retrieval results are poor, we communicate with the clients.

@filecoin-watchdog (Collaborator)

@1475Notary

I think there is some misunderstanding here. When clients leave comments about updating SPs, they list the SPs they plan to add, rather than all the SPs they are currently working with. SPs that were already disclosed are not disclosed again in the list of added SPs.

I'm referring to this message. The client said:

"Dear allocator, we have added sps and replaced the old sps to improve retrieval rate."

That list does not include the following SPs (f01084413, f01660795); hence my assumption.
To avoid this happening in the future, it might be helpful to keep the SP list updated in the form regularly. This way, we can ensure everything stays aligned.

@1475Notary (Author)

@filecoin-watchdog OK, understood. I will make sure my clients keep this information updated. Thank you for your review!

@galen-mcandrew (Collaborator)

Overall mixed compliance and onboarding. Some specific areas that need to be addressed:

  • extraneous and redundant replicas across the Filecoin ecosystem. Specifically this is not in compliance with your application or the program goals: "20 replicas created, and 30.08% of deals involve data replicated across fewer than 4 storage providers"
  • inaccurate or incomplete SP lists that are not updated
  • inconsistent and low retrieval rates

We are seeing stronger compliance across these areas with similar (and even smaller) clients/datasets/allocators. If your clients are not able to meet the standards that you require in your application, then you should not continue awarding more DataCap. These clients should be able to accurately grow trust over time, and the explanation that they are "busy" and that "sealing rates are slow" does not justify noncompliance.

Given these flags, we are approving an additional 2.5 PiB of DataCap, with the expectation that the allocator will hold their clients to their stated standards.

@1475Notary (Author)

@galen-mcandrew OK, noted. We will stop working with clients who do not fulfill the requirements.

@Kevin-FF-USA Kevin-FF-USA added DataCap - Throttled and removed Awaiting Response from Allocator If there is a question that was raised in the issue that requires comment before moving forward. labels Jan 9, 2025