[DataCap Refresh] <4th> Review of <IPFSTT> #273
@filecoin-watchdog Thank you for the guidance. We will work with clients to ensure a more decentralized distribution of DataCap.
Referring to the update of the SP list: it should be updated in the original application each time, so that the current list of SPs is kept in one place. Regarding geodiversification, remember that you can't create rules and then disregard them later; writing one into the allocator's regulations is binding in this sense. You also didn't address the question about the VPN detection tools that were written into your allocator's application. Are they implemented?
Hi @nicelove666, thanks for submitting this application for refresh. Warmly,
https://www.ipqualityscore.com/user/search
boost provider storage-ask f03220172
boost provider storage-ask f01999119
boost provider storage-ask f03286667
boost provider storage-ask f03312989
boost provider storage-ask f03220172
boost provider storage-ask f03282101
boost provider storage-ask f03253580
boost provider storage-ask f01999119
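For reference, a minimal sketch of how the storage-ask queries above could be run in one pass; it assumes the `boost` CLI is installed and on the PATH, and the provider IDs are simply the ones listed above:

```python
import subprocess

# Provider IDs taken from the storage-ask queries above.
PROVIDERS = [
    "f03220172", "f01999119", "f03286667",
    "f03312989", "f03282101", "f03253580",
]

for sp in PROVIDERS:
    # `boost provider storage-ask <minerID>` prints the SP's current storage ask.
    result = subprocess.run(
        ["boost", "provider", "storage-ask", sp],
        capture_output=True, text=True,
    )
    print(f"=== {sp} ===")
    print(result.stdout.strip() or result.stderr.strip())
```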
@nicelove666 can you summarize the evidence you are providing here? It appears to be investigations into potential VPN usage, but it would be helpful to get a summary from you, so that I do not misinterpret. From the above conversation and my investigation, these are some areas raised:
We are requesting an additional 20 PiB of DataCap for this pathway.
@galen-mcandrew Thank you.
https://www.ipqualityscore.com is a detection tool that quickly identifies suspicious activity across common fraud vectors such as bad bots, fraudulent transactions, account takeover fraud, proxies and VPNs, fake identities, and account opening abuse. It reports a "fraud score", and we can infer whether an SP uses a VPN from the size of that score. A fraud score of 0 indicates that the SP does not use a VPN; according to the website, if the score is over 75, the SP may be using a VPN.
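As an illustration only, here is a minimal sketch of how that threshold could be applied programmatically. The JSON endpoint path, the `fraud_score` field name, and the API key are assumptions about the IPQualityScore API (the comment above only references the web search page); the 0 and 75 thresholds come from the comment itself:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; an IPQualityScore account key is assumed
ENDPOINT = "https://www.ipqualityscore.com/api/json/ip/{key}/{ip}"  # assumed endpoint

def vpn_verdict(ip: str) -> str:
    """Classify an SP's IP using the fraud-score thresholds described above."""
    resp = requests.get(ENDPOINT.format(key=API_KEY, ip=ip), timeout=10)
    resp.raise_for_status()
    score = resp.json().get("fraud_score", 0)
    if score == 0:
        return f"{ip}: fraud score 0 -> no VPN indicated"
    if score > 75:
        return f"{ip}: fraud score {score} -> possible VPN/proxy use"
    return f"{ip}: fraud score {score} -> inconclusive"

print(vpn_verdict("203.0.113.7"))  # example IP from the documentation range
```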
We will strive to require SPs to be distributed across more regions.
We agree with the proposal here.
We will do it.
Friendly update on this refresh: we are currently in the process of moving to a Metaallocator. In order for the tooling to work correctly, an allocator can only use the DataCap balance they received through direct allocation from Root Key Holders, or the DataCap received through the Metaallocator. As a result, some of the metrics pages like Datacapstats, Pulse, and other graphs might be a little inconsistent during this update. You will not lose any DataCap, but you will see that your refresh is the amount of DataCap from the refresh plus the remaining DataCap the allocator has left. No action is needed on your part; this is just a friendly note to thank you for your contributions and patience, and to flag that you may notice changes in your DataCap balance while the back end is updated.
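For anyone reading the dashboards during the migration, the displayed figure is just the sum described above; a tiny worked example with made-up numbers (not from this application):

```python
# Illustrative numbers only, not taken from this application.
remaining_datacap_pib = 3.0   # DataCap the allocator still held before the refresh
refresh_amount_pib = 20.0     # DataCap granted in this refresh

# During the Metaallocator migration, dashboards show the combined figure.
displayed_balance_pib = refresh_amount_pib + remaining_datacap_pib
print(f"Displayed balance: {displayed_balance_pib} PiB")  # -> 23.0 PiB
```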
Basic info
Paste your JSON number: [1006]
Allocator verification: [yes]
Current allocation distribution
I. NREL National Solar Radiation Database
II. Dataset Completion
s3://nrel-pds-nsrdb/ (420.8 TiB)
s3://nrel-pds-nsrdb/v3/ (47.5 TiB)
s3://nrel-pds-nsrdb/v3/tmy/ (4.0 TiB)
s3://nrel-pds-nsrdb/v3/tdy/ (4.0 TiB)
s3://nrel-pds-nsrdb/v3/tgy/ (4.0 TiB)
s3://nrel-pds-nsrdb/v3/puerto_rico/ (114.6 GiB)
s3://nrel-pds-nsrdb/conus/ (48.2 TiB)
s3://nrel-pds-nsrdb/full_disc/ (81.8 TiB)
s3://nrel-pds-nsrdb/meteosat/ (16.1 TiB)
s3://nrel-pds-nsrdb/himawari/ (189.1 TiB)
s3://nrel-pds-nsrdb/india/ (515.3 GiB)
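A minimal sketch of how the per-prefix sizes above could be re-checked against the source bucket; it assumes the `nrel-pds-nsrdb` bucket allows anonymous listing and that boto3 is available (both are assumptions; only the bucket and prefixes come from the list above):

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous S3 client; public read access to the bucket is assumed.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

def prefix_size_tib(bucket: str, prefix: str) -> float:
    """Sum object sizes under a prefix and return the total in TiB."""
    total = 0
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            total += obj["Size"]
    return total / 2**40

print(prefix_size_tib("nrel-pds-nsrdb", "v3/tmy/"))  # expected ~4.0 TiB per the list above
```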
III. Does the list of SPs provided and updated in the issue match the list of SPs used for deals?
Yes (the client disclosed the SPs in advance and amended the application form).
IV. How many replicas has the client declared vs how many been made so far:
9 vs 9
V. Please provide a list of SPs used for deals and their retrieval rates
I. HyperAI
II. Dataset Completion
Xunlei BitTorrent
magnet:?xt=urn:btih:98F8E7FDDC919C0573AD5C99C31DE53D6866E071
magnet:?xt=urn:btih:ABB2DC586B2955AF86BB26FBFADEF227B0BF8DA7
magnet:?xt=urn:btih:AEFC07D18C9836EC5D019FDD5862EBEF770EBDC7
magnet:?xt=urn:btih:D00A14D1A6640DB8FDCDC6689E910F5D3BB07286
magnet:?xt=urn:btih:6DFD3D3B257C54FF86FE64D57AF45EB612DC402C
magnet:?xt=urn:btih:95A27F0CA2429022E0909B0B0BE3CF3FF13BEFC8
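The magnet links above can be matched to prepared data by their BitTorrent info-hashes; a small standard-library-only sketch for extracting the hash (the example link is the first one in the list above):

```python
from urllib.parse import urlparse, parse_qs

def infohash(magnet: str) -> str:
    """Extract the BitTorrent info-hash (btih) from a magnet URI."""
    params = parse_qs(urlparse(magnet).query)
    for xt in params.get("xt", []):
        if xt.startswith("urn:btih:"):
            return xt.removeprefix("urn:btih:")
    raise ValueError("no btih parameter found in magnet link")

print(infohash("magnet:?xt=urn:btih:98F8E7FDDC919C0573AD5C99C31DE53D6866E071"))
```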
III. Does the list of SPs provided and updated in the issue match the list of SPs used for deals?
Yes (the client disclosed the SPs in advance; we then advised them to update the SP list to the latest).
IV. How many replicas has the client declared vs how many been made so far:
9 vs 9
V. Please provide a list of SPs used for deals and their retrieval rates
I. Sinergise
II. Dataset Completion
s3://sentinel-cogs/(16.4 PiB)
s3://sentinel-cogs-inventory/(3.4 TiB)
III. Does the list of SPs provided and updated in the issue match the list of SPs used for deals?
No (the client disclosed 5 SPs, while the CID report lists 9 SPs). Although the application form was not kept up to date with the latest SPs, they were disclosed in advance on GitHub.
IV. How many replicas has the client declared vs how many been made so far:
6 vs 9
V. Please provide a list of SPs used for deals and their retrieval rates
Allocation summary
Our goals for this round: 1. Support existing clients. 2. Find new datasets. 3. Ask clients which part of the data they have stored.
Of the three new LDNs supported in this round, one is a new dataset, two are new at https://allocator.tech/, and two are duplicated at https://github.com/filecoin-project/filecoin-plus-large-datasets/issues. We will strengthen the requirements in the future.
Yes, we pay attention to the progress of all clients and treat them equally. When SP retrieval is slow, data backup is unreasonable, or new SPs are added, we ask questions and suspend support. We recently asked clients to provide more information (individuals provide ID cards, companies provide business licenses), but clients have opposed this. Only one client sent us an email (which we forwarded to the governance team). The clients' non-cooperation makes us suspicious. Maybe we should be more tolerant?
Understand their technical solutions; regularly generate CID reports to follow up on data distribution; ask them which part of the data is stored; ask them to find new datasets.
We are actively looking for enterprise clients and new data sets.
Yes
Yes