It's weird. When I try to download manga from hentaifox, the download initializes and appears to fetch a couple of files, but it ultimately fails, as seen below:
This is another error that I am seeing when trying to download manga from hentaifox:
hentaifox downloads are failing
Re: hentaifox downloads are failing
Ok, so I don't know why it is taking so long to get a response on this issue, much less a fix. Between the time I created this post and now, even with my full-time job, I was able to write a PowerShell script that successfully does what HDoujin Downloader no longer can. I will post the code below in case it helps you push an updated version of the application more quickly.
Code:
function download-manga
{
    param (
        [Parameter(Position = 0, Mandatory = $true)]
        [String[]]$urls
    )

    foreach ($url in $urls)
    {
        $web = Invoke-WebRequest $url

        # Strip the site suffix from the page title, then remove characters
        # that are invalid (or awkward) in Windows folder names.
        $title  = $web.ParsedHtml.title
        $folder = $title.Substring(0, $title.Length - 12) -replace '[<>|():?*\\/~,.]', ''
        # Collapse any runs of whitespace left behind by the removals.
        $folder = ($folder -replace '\s{2,}', ' ').Trim()

        New-Item "$env:USERPROFILE\Downloads\Manga\$folder" -ItemType Directory -Force | Out-Null

        # The element with class "wrap" contains the "Pages: N" text.
        $findp = $web.AllElements | Where-Object { $_.class -like 'wrap' }
        if ($findp[0].innerText -notmatch 'Pages:\s*(\d+)')
        {
            Write-Warning "Could not find a page count for $url; skipping."
            continue
        }
        [int]$pagenum = $Matches[1]

        # Pull the gallery id out of the link to the first reader page.
        $num = $null
        foreach ($link in $web.Links)
        {
            if ($link.href -match '/g/(\d+)/1/')
            {
                $num = $Matches[1]
                break
            }
        }
        if (-not $num)
        {
            Write-Warning "Could not find a gallery id for $url; skipping."
            continue
        }

        # Visit each reader page and save its image under the original file name.
        for ($j = 1; $j -le $pagenum; $j++)
        {
            $page = Invoke-WebRequest "https://hentaifox.com/g/$num/$j/"
            $name = ($page.Images[0].src -split '/')[-1]
            Invoke-WebRequest $page.Images[0].src -OutFile "$env:USERPROFILE\Downloads\Manga\$folder\$name"
        }
    }
}
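For anyone who wants to try the script above: call the function with one or more gallery page URLs (the script looks for a link of the form /g/&lt;id&gt;/1/ on the page it fetches). The gallery id below is a made-up placeholder, not a real gallery:

Code:
# Downloads into %USERPROFILE%\Downloads\Manga\<gallery title>\
# "12345" is a placeholder id; substitute the URL of a real gallery page.
download-manga "https://hentaifox.com/gallery/12345/"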
Last edited by demondante on Tue Dec 10, 2019 2:26 am, edited 1 time in total.
Re: hentaifox downloads are failing
Instead of creating a duplicate thread, I will post in this one.
I am having the exact same problem.
Re: hentaifox downloads are failing
Since the relevant information is obscured in the picture posted: the exact issue is that an "Error (Pages Missing)" error appears after 10 pages. If a gallery is 10 pages or fewer, the download completes; if it is 11 pages, or 1000, it does not.
- Webscratcher
Re: hentaifox downloads are failing
Same problem here.
For issues like this, the downloader (IMHO) really needs an expert debug mode* that records what it is trying to do in the background: just enough to say which step it is currently in, what it expected, and what it actually got.
That could save Squidy some time tracking down the problem, and might even help with problems Squidy can't reproduce (and let us/me know where the problem is as well; some of us are curious about that).
It seems like, for some reason, HFox is handing the downloader wrong page URLs (given the way the downloader collects them), or perhaps no URLs at all after the first 10?
*: At least the current debug mode doesn't seem very talkative about what the downloader is doing. The only thing it ever wrote into debug.log was a random I/O error.
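One quick way to test that hypothesis, independent of the downloader: request reader pages past the 10th directly and see whether the site still serves an image. This is just a sketch, assuming the /g/&lt;id&gt;/&lt;page&gt;/ URL layout used in demondante's script; "12345" is a placeholder id, so substitute one from a gallery that fails:

Code:
# Probe reader pages around the 10-page boundary.
foreach ($p in 9..13)
{
    try
    {
        $resp = Invoke-WebRequest "https://hentaifox.com/g/12345/$p/"
        "{0,3}: HTTP {1}, image = {2}" -f $p, $resp.StatusCode, $resp.Images[0].src
    }
    catch
    {
        "{0,3}: request failed ({1})" -f $p, $_.Exception.Message
    }
}

If pages 11+ still return a valid image URL here, the site is fine and the bug is in how the downloader collects its page URLs; if they fail, the site itself changed.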