Search should use underlying repo #51
Comments
@srossross would something like this work for this purpose?
yes, that looks good
@srossross please take a look at the linked PR; I did some testing and provided my test results along with it
@karamba228 each search should 'search the underlying repo', e.g. the pypi search should fetch from pypi.org
So we are not using the DuckDB tables anymore?
no, we are not
@srossross the PR is updated. For pypi packages, I had to use Libraries.io, because the PyPI API does not support search and so does not return a list of packages with similar names. npm and conda reach out directly to their respective APIs. Libraries.io requires an API key to make requests, so you will need to add an environment variable to the .env file.
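The per-registry approach described in the comment above could be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the function names are hypothetical, but the endpoints are the public search endpoints of the npm registry, anaconda.org, and Libraries.io, with the Libraries.io API key read from the environment as discussed.

```python
import os
from urllib.parse import urlencode

# Public search endpoints for each underlying repo (Libraries.io stands
# in for PyPI, which has no search API of its own).
NPM_SEARCH = "https://registry.npmjs.org/-/v1/search"
CONDA_SEARCH = "https://api.anaconda.org/search"
LIBRARIES_IO_SEARCH = "https://libraries.io/api/search"


def npm_search_url(query: str) -> str:
    # npm's registry exposes full-text search via the `text` parameter.
    return f"{NPM_SEARCH}?{urlencode({'text': query})}"


def conda_search_url(query: str) -> str:
    # anaconda.org supports searching packages by name directly.
    return f"{CONDA_SEARCH}?{urlencode({'name': query})}"


def pypi_search_url(query: str) -> str:
    # Libraries.io requires an API key; per the comment above, it is
    # expected in an environment variable (name here is illustrative).
    api_key = os.environ["LIBRARIES_IO_API_KEY"]
    params = {"q": query, "platforms": "Pypi", "api_key": api_key}
    return f"{LIBRARIES_IO_SEARCH}?{urlencode(params)}"
```

Each search page would then fetch its URL at request time instead of querying pre-scraped DuckDB tables.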
Moving away from the in-advance scraper model, the search feature should be broken out into pypi, conda, and npm search pages that search the underlying repo.