Unable to download <SOLVED>

Sector11

I have always had problems with one.com, regardless of the day.

Clicking the link and downloading in Iceweasel has a 100% failure rate for me.
And I'll add, at times just logging in is a pain for me - I know you've heard that before - but there it is again, and not just on Mondays.

@ dizzie: wget -c means 'continue', and it may or may not be a good idea:
       -c
       --continue
           Continue getting a partially-downloaded file.  This is useful when
           you want to finish up a download started by a previous instance of
           Wget, or by another program.  For instance:

                wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z

           If there is a file named ls-lR.Z in the current directory, Wget
           will assume that it is the first portion of the remote file, and
           will ask the server to continue the retrieval from an offset equal
           to the length of the local file.

           Note that you don't need to specify this option if you just want
           the current invocation of Wget to retry downloading a file should
           the connection be lost midway through.  This is the default
           behavior.  -c only affects resumption of downloads started prior to
           this invocation of Wget, and whose local files are still sitting
           around.

           Without -c, the previous example would just download the remote
           file to ls-lR.Z.1, leaving the truncated ls-lR.Z file alone.

           Beginning with Wget 1.7, if you use -c on a non-empty file, and it
           turns out that the server does not support continued downloading,
           Wget will refuse to start the download from scratch, which would
           effectively ruin existing contents.  If you really want the
           download to start from scratch, remove the file.

           Also beginning with Wget 1.7, if you use -c on a file which is of
           equal size as the one on the server, Wget will refuse to download
           the file and print an explanatory message.  The same happens when
           the file is smaller on the server than locally (presumably because
           it was changed on the server since your last download
           attempt)---because "continuing" is not meaningful, no download
           occurs.

           On the other side of the coin, while using -c, any file that's
           bigger on the server than locally will be considered an incomplete
           download and only "(length(remote) - length(local))" bytes will be
           downloaded and tacked onto the end of the local file.  This
           behavior can be desirable in certain cases---for instance, you can
           use wget -c to download just the new portion that's been appended
           to a data collection or log file.

           However, if the file is bigger on the server because it's been
           changed, as opposed to just appended to, you'll end up with a
           garbled file.  Wget has no way of verifying that the local file is
           really a valid prefix of the remote file.  You need to be
           especially careful of this when using -c in conjunction with -r,
           since every file will be considered as an "incomplete download"
           candidate.

           Another instance where you'll get a garbled file if you try to use
           -c is if you have a lame HTTP proxy that inserts a "transfer
           interrupted" string into the local file.  In the future a
           "rollback" option may be added to deal with this case.

           Note that -c only works with FTP servers and with HTTP servers that
           support the "Range" header.
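
In practice that means an interrupted VSIDO download can be picked up where it left off; a minimal sketch, using the vsido.org link from this thread:

# resume a partial ISO download in place
wget -c http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso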


I have to rethink that one for my aliases, maybe:
       -t number
       --tries=number
           Set number of retries to number.  Specify 0 or inf for infinite
           retrying.  The default is to retry 20 times, with the exception of
           fatal errors like "connection refused" or "not found" (404), which
           are not retried.

would be better, but I've never needed more than 4 (once, I think); 3 a few times, and 2 very commonly.
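
If I do work it into an alias, it would be something like this (a sketch only - my actual wg2 alias may differ):

# hypothetical alias: resume partial downloads, retry up to 4 times
alias wg2='wget -c --tries=4'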

At the moment Iceweasel is downloading it from Google Drive, and true to form it is using 100% of my bandwidth. I'm writing this in medit while it does that.

And again - Google sees my IP addy - Argentina - why do I get FRENCH?
That's a rhetorical question, not looking for or expecting an answer.

Started at 16:14 local
16:19:?? - 120.5 MB - ±40 min I'm guessing on my ~3 Mb connection
16:28:11 - 300MB (I waited for it)
16:45:21 - Finished


You can see a failed download stalled at 229.7 MB; it's still there because I was testing.

Going to try 'wg2' on the Google Drive link and see what that does.
Stay Home

VastOne

Any download from anywhere will use 100% of your bandwidth unless you throttle it with something like wget, as you have outlined

To me, 5 minutes of 100% bandwidth usage is not an issue

What is an issue is failure ...  All I need to know is that downloads from Google Drive are completing at 100%
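
For anyone who does want to cap the rate, wget can throttle itself - a quick sketch, where the 200k figure is just an example value:

# limit the transfer rate so the download doesn't saturate the connection
wget --limit-rate=200k http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso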
VSIDO      VSIDO Change Blog    

    I dev VSIDO

VastOne

I have not heard of any other reported issues with one.com since the very beginning

If anyone else has issues at any time, please report them
VSIDO      VSIDO Change Blog    

    I dev VSIDO

Sector11

  I know, I'm the only person ... maybe the Argentinian connection.  Who knows.

Also, I cannot get wget to work with Google Drive - the links just open a viewer page in your browser. (Now the pages are in Spanish - go figure.)

01 Apr 13 | 17:30:49 ~
         $ wg2 https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?pli=1
--2013-04-01 17:31:12--  https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?pli=1
Resolving docs.google.com (docs.google.com)... 173.194.42.0, 173.194.42.1, 173.194.42.5, ...
Connecting to docs.google.com (docs.google.com)|173.194.42.0|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: 'edit?pli=1'

    [ <=>                                                                                    ] 14,377      --.-K/s   in 0.07s   

2013-04-01 17:31:13 (213 KB/s) - 'edit?pli=1' saved [14377]


01 Apr 13 | 17:31:13 ~
         $ wg2 https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
--2013-04-01 17:31:48--  https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
Resolving docs.google.com (docs.google.com)... 173.194.42.6, 173.194.42.7, 173.194.42.8, ...
Connecting to docs.google.com (docs.google.com)|173.194.42.6|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: 'edit?usp=sharing'

    [ <=>                                                                                    ] 14,380      --.-K/s   in 0.01s   

2013-04-01 17:31:49 (1.19 MB/s) - 'edit?usp=sharing' saved [14380]


01 Apr 13 | 17:31:49 ~
         $ wget https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
--2013-04-01 17:32:16--  https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
Resolving docs.google.com (docs.google.com)... 173.194.42.8, 173.194.42.2, 173.194.42.0, ...
Connecting to docs.google.com (docs.google.com)|173.194.42.8|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: 'edit?usp=sharing.1'

    [ <=>                                                                                    ] 14,380      --.-K/s   in 0.05s   

2013-04-01 17:32:17 (260 KB/s) - 'edit?usp=sharing.1' saved [14380]


01 Apr 13 | 17:32:17 ~
         $
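
A quick way to confirm those 14 KB files are just the Drive viewer page and not ISO data - a sketch using the standard file utility:

# check what wget actually saved; it should report HTML documents, not ISO 9660 images
file 'edit?pli=1' 'edit?usp=sharing'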
Stay Home

VastOne

The file in the wget instructions is still the same; it comes from vsido.org, not from the Google Drive site

Since that one works fine, I have left it as it is
VSIDO      VSIDO Change Blog    

    I dev VSIDO

jefsview

Sorry, I left the computer for a few hours.

The download was faster, but I didn't time it. Direct download always takes longer for me than wget, but the one from Google Drive completed in less time. I'm just used to waiting and waiting for a dl to finish; this one was done before I realized it. And complete.

-- Jeff

VastOne

No worries jefsview...

Thanks for the test and the report back, it is appreciated!  8)
VSIDO      VSIDO Change Blog    

    I dev VSIDO

lwfitz

I am home today and have multiple networks from multiple ISPs .....  ;D ;D ...... So I am now going to test wget and direct download on all of them. I'll update, and maybe we can get this figured out.


Ok ...... I found a fix for wget timing out and then terminating.

Maybe someone else can test this also

wget -c --tries=0 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso


The "tries=0" command tells wget to retry the connection to the download infinitely......

so if you wanted it to retry 5 times it would be
tries=5

and the and the -c command tells it to continue with the partial download.
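
So, for example, the full command with five retries would be:

wget -c --tries=5 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso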


Edit:

Oops it looks like Sector11 hit on this earlier but I missed his post.
Don't Be A Dick!

VastOne

VSIDO      VSIDO Change Blog    

    I dev VSIDO

lwfitz

^ Thanks. I'm testing now


Ok, direct download from http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso on my main network failed, but wget with

wget -c --tries=0 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso

worked like a charm on my main network, which is a wireless 4G Clear connection.


Both direct and wget downloads are running on my Verizon DSL network.


Ok, so both downloads on Verizon errored, but wget with the retry option continued on:

luke@G73JH-VSIDO:~$  wget -c --tries=0 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
--2013-04-01 15:54:04--  http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
Resolving vsido.org (vsido.org)... 46.30.211.55
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 670040064 (639M) [application/x-iso9660-image]
Saving to: 'vsido_v1-2_3.8-3_Kernel.iso'

18% [======>                                ] 122,444,003  230KB/s   in 10m 1s

2013-04-01 16:04:38 (199 KB/s) - Connection closed at byte 122444003. Retrying.

--2013-04-01 16:04:39--  (try: 2)  http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 670040064 (639M), 547596061 (522M) remaining [application/x-iso9660-image]
Saving to: 'vsido_v1-2_3.8-3_Kernel.iso'

20% [+++++++                                ] 136,576,148  349KB/s  eta 25m 23s



Now going to try to test and see why it's erroring like that
Don't Be A Dick!

lwfitz

Very odd .... now direct download from the original link works fine, but wget (without the retry option) still terminates,
and there aren't any dropped packets or errors at all. The connection just terminates.


Running for 27 mins, 51 secs, since 2013-04-01 23:31:24 UTC+0000.
Total 1,167,856,833 bytes, in 1,267,350 packets. (1,267,514 captured, 0 dropped)
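
To see how the connection dies, the capture could be narrowed to just the vsido.org server - a sketch, assuming tcpdump is installed (46.30.211.55 is the server address from the logs above):

# record all traffic to/from the vsido.org server for later inspection
sudo tcpdump -i any -w vsido_dl.pcap host 46.30.211.55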
Don't Be A Dick!

VastOne

^ So strange, and I can never get any one of them to fail
VSIDO      VSIDO Change Blog    

    I dev VSIDO

lwfitz

Here's how to wget from Google Drive


First we need the actual URL for the ISO, which in this case happens to be

http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso

Now

wget http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso


worked just fine for me, but from what I read some people had intermittent issues, so
this should take care of any erroring:

wget -c --tries=0 --no-check-certificate --content-disposition http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso

(--no-check-certificate skips SSL certificate validation, and --content-disposition tells wget to use the file name the server suggests instead of one taken from the URL.)



This will work until VastOne changes the name of the ISO, at which point you would just change the file name in the wget command
Don't Be A Dick!

VastOne

Not the case lwfitz...

http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso

is the vsido.org URL location

The URL for the Google Drive ISO is

https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
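
Side note: that /file/d/.../edit link serves the HTML viewer page, which is why wget only saved ~14 KB of HTML earlier. A direct-download form of the same link - a sketch only, assuming the file is shared publicly, and with no guarantee it works for a file this size - would be:

# hypothetical direct-download form of the Drive share link above
wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=0B4gKMu7RCW3eMjZKelNPVlE2S3M'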
VSIDO      VSIDO Change Blog    

    I dev VSIDO

VastOne

And everything I checked on the internet today says getting wget to work with Google Drive is a NO CAN DO

And that is all right because the file is wget'able from vsido.org, so it is a non-issue

But I will add your wget instructions to the How To/Download page, if a How To gets done for it

Thanks lwfitz!
VSIDO      VSIDO Change Blog    

    I dev VSIDO