hentaifox downloads are failing

Run into a problem? Report it here!
demondante
Posts: 10
Joined: Mon Aug 07, 2017 5:18 am

hentaifox downloads are failing

Post by demondante » Sun Nov 24, 2019 9:29 pm

It's weird. When I try to download manga from hentaifox, the download will initialize and appear to download a couple of files, but it ultimately fails, as seen below:

[screenshot of the error]

This is another error that I am seeing when trying to download manga from hentaifox:

[second screenshot]

demondante
Posts: 10
Joined: Mon Aug 07, 2017 5:18 am

Re: hentaifox downloads are failing

Post by demondante » Fri Nov 29, 2019 9:40 pm

OK, so I don't know why it's taking so long to get a response on this issue, let alone a fix. Between the time I created the original post and now, even with my full-time job, I was able to write a PowerShell script that successfully does what HDoujin Downloader no longer can. I'll post the code below in case it helps you push an updated version of the application more quickly.

Code: Select all

function download-manga
{
    param (
        [Parameter(Position = 0, Mandatory = $true)]
        [String[]]$urls
    )

    # Note: this relies on ParsedHtml, so it needs Windows PowerShell 5.x
    # (the property is backed by the IE engine and is gone in PS Core).
    foreach ($url in $urls)
    {
        $web = Invoke-WebRequest $url

        # Build a folder name from the page title: drop the trailing
        # 12-character " - HentaiFox" suffix, strip characters that are
        # illegal or awkward in Windows paths, then collapse whitespace.
        $title  = $web.ParsedHtml.title
        $folder = $title.Substring(0, $title.Length - 12) -replace '[<>|():?*\\/~,.]', ''
        $folder = ($folder -replace '\s{2,}', ' ').Trim()

        New-Item "$env:USERPROFILE\Downloads\Manga\$folder" -ItemType Directory -Force | Out-Null

        # Read the page count out of the "Pages: N" text in the info block.
        $info = ($web.AllElements | Where-Object { $_.class -like "wrap" })[0].innerText
        [int]$pagenum = if ($info -match 'Pages[:\s]*(\d+)') { $Matches[1] } else { 0 }
        Write-Host "$folder ($pagenum pages)"

        # Pull the gallery ID out of the link to the first reader page.
        $galleryId = $null
        foreach ($link in $web.Links)
        {
            if ($link.href -match '/g/(\d+)/1/')
            {
                $galleryId = $Matches[1]
                break
            }
        }

        # Walk the reader pages and save the single image on each one.
        for ($j = 1; $j -le $pagenum; $j++)
        {
            $page = Invoke-WebRequest "https://hentaifox.com/g/$galleryId/$j/"
            $src  = $page.Images[0].src
            Invoke-WebRequest $src -OutFile "$env:USERPROFILE\Downloads\Manga\$folder\$($src.Split('/')[-1])"
        }
    }
}
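
In case it's not obvious, you call it with one or more gallery URLs (the gallery IDs below are made up), and the pages land in $env:USERPROFILE\Downloads\Manga\<gallery title>:

Code: Select all

# made-up gallery URLs; pass one or several, comma-separated
download-manga "https://hentaifox.com/gallery/12345/"
download-manga "https://hentaifox.com/gallery/12345/", "https://hentaifox.com/gallery/67890/"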
Last edited by demondante on Tue Dec 10, 2019 2:26 am, edited 1 time in total.

DarkH
Posts: 1
Joined: Sat Nov 30, 2019 5:17 am

Re: hentaifox downloads are failing

Post by DarkH » Sat Nov 30, 2019 5:48 am

Instead of creating a duplicate thread, I will post in this one.
I am having the exact same problem.

comes-luts
Posts: 1
Joined: Sun Dec 15, 2019 12:47 am

Re: hentaifox downloads are failing

Post by comes-luts » Sun Dec 15, 2019 12:49 am

Seeing as the relevant information is obscured in the picture posted, here is the exact issue: an "Error (Pages Missing)" message appears after 10 pages. If a gallery is 10 pages or fewer, the download completes; if it's 11 or 1,000 pages, it does not.

Webscratcher
Posts: 5
Joined: Sun Nov 25, 2018 11:31 am

Re: hentaifox downloads are failing

Post by Webscratcher » Sat Jan 04, 2020 8:48 pm

Same problem here.

For stuff like this, the downloader (IMHO) would really need an expert debug mode* where it actually writes down what it's trying to do in the background (just enough to say which step it's currently in, what it expects, and what it got, or something like that).

That could maybe save Squidy some time tracking down the problem, and might even help with problems Squidy can't reproduce (and it would let us/me(?) see where the problem is as well; some of us, like me, are curious about that :) ). Something along the lines of the sketch below is what I have in mind.
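
Purely hypothetical, to be clear; none of these names come from the actual downloader. It's just the shape of logging I mean:

Code: Select all

# Hypothetical sketch only; nothing here comes from HDoujin Downloader.
function Write-StepLog
{
    param([string]$Step, [string]$Expected, [string]$Got)
    $line = "{0:u}  step: {1} | expected: {2} | got: {3}" -f (Get-Date), $Step, $Expected, $Got
    Add-Content -Path "$env:USERPROFILE\hdoujin-debug.log" -Value $line
}

# The kind of entry that would make the failure obvious at a glance:
Write-StepLog -Step "collect page urls" -Expected "25 reader links" -Got "10 reader links"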


Though it seems like, for some reason (with the way the downloader collects its page URLs), HFox is handing it wrong links? Or maybe, after 10, none at all?
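
If anyone wants to test that, here's a rough PowerShell one-off (the gallery URL is just a placeholder, put in a real one). If the count stops around 10 no matter how long the gallery is, then the initial HTML simply doesn't contain the rest of the links:

Code: Select all

# Rough check, not code from the downloader: fetch one gallery page and
# count the reader links (/g/<id>/<page>/) in the initial HTML.
$web = Invoke-WebRequest "https://hentaifox.com/gallery/12345/"   # placeholder URL
$readerLinks = $web.Links | Where-Object { $_.href -match '/g/\d+/\d+/' }
"Reader links in the initial HTML: $(@($readerLinks).Count)"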


*: At least the current debug mode doesn't seem to be very talkative about what the downloader is doing. The only thing it ever wrote into the debug.log was a random I/O error.

