If there's a feature you think HDoujin is missing, share your ideas here!
-
RayZor
- Posts: 6
- Joined: Sat Oct 27, 2018 6:04 pm
Post
by RayZor » Sun Oct 27, 2019 5:03 am
So I was trying to download a user's gallery on D-art and wasn't having much luck, until I noticed that it has RSS feeds for all the galleries. For example:
Code: Select all
https://backend.deviantart.com/rss.xml?q=gallery%3A{username}
Just replace the {username} and you'll see it in action.
From the XML that gets returned, the key lines are:
The full-res image:
Code: Select all
<media:content url="https://images-wixmp-{*****...long security url.....******----}" height="xxx" width="xxx" medium="image"/>
Reg exp to extract:
/<media:content\s+[^<]*?url=[\'\"](?<value>[^<]*?)[\'\"]/g
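In case anyone wants to try it outside the downloader first, here's a rough Python sketch of fetching the feed and pulling the image URLs with that regex (the username is just a placeholder, and the named group is swapped for a plain one to keep Python's re module happy):
Code: Select all
import re
import urllib.request

# Placeholder username - swap in the gallery you want to rip.
username = "someartist"
feed_url = "https://backend.deviantart.com/rss.xml?q=gallery%3A" + username

# Fetch one page (60 items) of the gallery's RSS feed.
with urllib.request.urlopen(feed_url) as response:
    xml = response.read().decode("utf-8")

# Same idea as the regex above: grab the url attribute of each <media:content> tag.
image_re = re.compile(r"""<media:content\s+[^<]*?url=['"]([^<]*?)['"]""")
for url in image_re.findall(xml):
    print(url)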
The link to the next 60 images:
Code: Select all
<atom:link rel="next" href="https://backend.deviantart.com/rss.xml?type=deviation&q=gallery%3A{username}&offset=60"/>
Reg exp to extract:
/<atom:link rel="next"\s+[^<]*?href=[\'\"](?<value>[^<]*?)[\'\"]/g
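And here's a rough sketch of how the two regexes could be combined to walk a whole gallery, following the next link until it runs out (again just an illustration, not the downloader's code; note the href comes back XML-escaped, so &amp; needs turning back into &):
Code: Select all
import re
import urllib.request

# Placeholder username for illustration.
username = "someartist"
page_url = "https://backend.deviantart.com/rss.xml?q=gallery%3A" + username

image_re = re.compile(r"""<media:content\s+[^<]*?url=['"]([^<]*?)['"]""")
next_re = re.compile(r"""<atom:link rel="next"\s+[^<]*?href=['"]([^<]*?)['"]""")

all_images = []
while page_url:
    with urllib.request.urlopen(page_url) as response:
        xml = response.read().decode("utf-8")
    # Collect every full-size image URL on this page.
    all_images.extend(image_re.findall(xml))
    # Follow the rel="next" link to the next 60 items, if there is one.
    match = next_re.search(xml)
    page_url = match.group(1).replace("&amp;", "&") if match else None

print("Found", len(all_images), "image URLs")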
Is there a way to take this information and apply it in the downloader to enable ripping of full galleries?
-
Squidy
- Site Admin
- Posts: 1270
- Joined: Fri Mar 10, 2017 9:28 pm
-
Contact:
Post
by Squidy » Tue Oct 29, 2019 4:03 am
Cool find; I appreciate the amount of detail you've gone into with this. I've implemented this in the latest release as a quick workaround to get DeviantArt kinda working again.
One caveat, though: the images you get through this means won't be at their original size; they're still downscaled a little bit (at least for large images). That said, this is a step in the right direction, because it at least makes crawling through a user's gallery simpler. I'll look into fixing it in a future release so that the original-size image URLs are pulled instead.
-
RayZor
- Posts: 6
- Joined: Sat Oct 27, 2018 6:04 pm
Post
by RayZor » Mon Nov 04, 2019 9:42 pm
Yep, that works a bloody treat - thank you!