DeviantArt 404 problem #488
They just changed a bunch of stuff. It might not be relevant, though.
A (temporary) solution that I just found is to set the originals setting in gallery-dl.conf to false, like so:
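A minimal gallery-dl.conf sketch of that workaround, assuming the setting meant here is the deviantart extractor's "original" option:

```json
{
    "extractor": {
        "deviantart": {
            "original": false
        }
    }
}
```

With "original" disabled, gallery-dl should fall back to the preview versions of images instead of requesting the original downloads that are currently 404ing, at the cost of image quality.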
Can confirm. Everything worked fine for the last couple of days until some point late last night. The token it gets for the URL seems to be invalid, maybe? I'm using this configuration for deviantart:
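(A typical deviantart section with a refresh token looks something like the sketch below; the values are placeholders and the exact option selection is an assumption, not the commenter's actual configuration:)

```json
{
    "extractor": {
        "deviantart": {
            "refresh-token": "xxxxxxxxxxxxxxxxxxxx",
            "original": true,
            "mature": true
        }
    }
}
```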
Tried with and without a refresh token, then re-ran gallery_dl oauth:deviantart just to see if a new refresh token would help, but no. I noticed it mainly failing for Sta.sh items, but on further investigation quite a lot of regular deviations are failing too. Examples: https://www.deviantart.com/shnider/art/Twilight-Sharkle-464517039 ... These worked when I downloaded them yesterday; now they result in a 404:
/addlabel: site-change
Download links like the ones above now only redirect to the real download URL if you are logged in. Clicking the download button on a deviation page without an active DeviantArt session (in a private browser window, for example) redirects you to the login page, and opening any download links directly gives a 404 page. All of this works when logged in. On a more positive note: download links from their OAuth API got fixed and work again, so I will most likely revert a bunch of the changes made during #436, and everything except scraps and direct deviation links should work again. Possible solutions for right now:
Is it possible at all to get scraps and direct links to work again? A lot of the art I've been downloading has been scraps (I'm not sure how much of it was direct links).
This commit (partially) reverts 27b5b24, 94eb7c6, and a437e78. Download URLs from the 'extended_fetch' endpoint are now only usable by logged-in users, while those from the respective OAuth API endpoint are working again. Everything except scraps and direct deviation links should be fixed, and those two categories will work with exported cookies. (#488)

TODO:
- "native" login with --username and --password
- better handling of internally stored cookies
359c3bc should fix original image downloads for everything except the two categories that don't use the OAuth API (scraps and direct links); those will work with exported DeviantArt cookies. #445 had some discussion about cookie usage on DeviantArt, if anyone is interested. Next on the to-do list: login support with --username and --password.
Could you please explain how to revert to a previous version? Upgrade commands seem to work only towards later versions, and it still shows 1.11.2 after switching the executable in PATH for the older one and running the upgrade command. Sorry in advance for the dumb questions; I guess it's common knowledge around here.
You can tell pip to install a specific version, or simply replace the standalone executable with one from an older release.
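A sketch of the pip route, matching the py launcher used later in this thread (the version number below is only illustrative):

```
py -3 -m pip install gallery-dl==1.11.1
```

Afterwards, py -3 -m gallery_dl --version should report the pinned version.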
Both ways work like a charm. Thank you very much.
I'm having some problems with the dA plug-in too, which seem to be related to the changes. At first I was using dA with a public access token and default settings, which resulted in temporary filenames for images with special characters (like a colon) in the title. Now I'm logged in via OAuth and am trying to use a custom filename format. Does the API no longer provide the original file extension? Not using a custom filename format works, but that puts me back at the initial problem.
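As an illustration only (the actual format string used isn't shown above), a custom deviantart filename format in gallery-dl.conf might look like this; {index}, {title}, and {extension} are standard gallery-dl format-string keywords:

```json
{
    "extractor": {
        "deviantart": {
            "filename": "{index}_{title}.{extension}"
        }
    }
}
```

The question above would then be about {extension} coming back empty from the API.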
Hmm, you are right. Also, regarding "temporary filenames for images with special characters (like a colon) in the title": what do you mean by that? Could you provide an example DeviantArt link where this happens?
Example: https://www.deviantart.com/evelar/art/Subject-BA-XX81-822258460 ("Subject: BA-XX81"), which 404s, by the way. Single images seem to be broken, or OAuth is acting up, I don't know. (Edit: I'm using the current dev version.)
Single images as well as scraps can't be fetched with the OAuth API. For single images you'd need their old UUID, which is no longer available in the new Eclipse interface, and the OAuth API has no (documented) endpoint for scraps. Both therefore have to use the new public endpoints, which require you to be logged in to download the original image versions. dA's filenames, by the way, have the numeric deviation ID in front.
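To make the UUID point concrete: the OAuth API's deviation endpoint takes a UUID rather than the numeric ID visible in deviation URLs. A hypothetical call, with placeholder token and UUID:

```
curl "https://www.deviantart.com/api/v1/oauth2/deviation/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee?access_token=XXXX"
```

The numeric ID (822258460 in the example above) cannot be used here, which is why single images have to go through the public endpoints instead.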
One thing, though: XFS is perfectly fine with colons in filenames, so it must be another issue. But I'll just go with path-restrict for now.
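For reference, a sketch of that setting; assuming path-restrict takes a string of characters to be replaced in filenames, as in this release line, stripping colons would look like:

```json
{
    "extractor": {
        "deviantart": {
            "path-restrict": ":"
        }
    }
}
```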
How do you do this? I've tried adding the following line to the deviantart settings to give it access to the browser cookies (pointing it at Chrome's cookie file), but it doesn't work.
It should be "cookies": "path/to/cookies.txt", where the value points to a Netscape-format cookies.txt export. Specifying the location where Chrome stores its cookies won't work, because gallery-dl doesn't know how to read cookies from such a file. You'll have to use a browser addon that can export your cookies as a cookies.txt file. I'm personally using export-cookies-txt in Firefox, but I can't give you any recommendations for Chrome. Edit: Downloading single images and scraps currently won't work regardless of your cookie setup: #505
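In gallery-dl.conf that ends up looking like the sketch below (the file path is a placeholder):

```json
{
    "extractor": {
        "deviantart": {
            "cookies": "~/cookies.txt"
        }
    }
}
```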
Hello. When trying to grab any gallery I get the same type of message:

```
[downloader.http][warning] '404 Not Found' for 'https://www.deviantart.com/download/739386335/dc87m5b-eddbd940-e7c0-481b-a65c-9b599d403e8b.jpg?token=bd5f065c2f02e4e05baa47e026b2331054f71b2d&ts=1574766971'
[download][error] Failed to download deviantart_739386335_Christian Maverick [Commission].jpg
```

Funnily enough, it worked perfectly just yesterday; I even tested it on the same gallery. Sorry in advance for being a total and utter noob, but I will try to provide any information necessary. I am using the Windows version with Python installed, gallery-dl version 1.11.2-dev. The command used is:

```
py -3 -m gallery_dl -u [USERNAME] -p [PASSWORD] "URL"
```