VSIDO Community

VSIDO Support => General Support => Topic started by: lwfitz on April 01, 2013, 05:25:03 AM

Title: Unable to download <SOLVED>
Post by: lwfitz on April 01, 2013, 05:25:03 AM
I guess the title says it all. I am unable to download directly; it gets to about 200 MB or so and then errors out. So I tried wget, and that is even less stable: wget will download between 20 and 200 MB but then terminates.

Luckily I have an older ISO (January, I think), so I'm OK; I can install it and run the update script.

I know this has been addressed but has anyone come up with a fix or reason for this?
Title: Re: Unable to download
Post by: dizzie on April 01, 2013, 05:37:23 AM
Try adding -c to wget: wget -c <location://file>

-c is for resume or something.
Title: Re: Unable to download
Post by: lwfitz on April 01, 2013, 06:00:11 AM
No luck. This is what I'm getting:

Code:
luke@VSIDO-FX:~$ wget -c http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
--2013-03-31 22:58:13--  http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
Resolving vsido.org (vsido.org)... 46.30.211.55
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 670040064 (639M), 669017256 (638M) remaining [application/x-iso9660-image]
Saving to: ‘vsido_v1-2_3.8-3_Kernel.iso’

 3% [>                                      ] 22,073,143   332KB/s  eta 44m 32sTerminated


I just installed with an older image, so it's not really a big deal for me right at this moment.
Title: Re: Unable to download
Post by: dizzie on April 01, 2013, 11:31:05 AM
Let me dropbox the iso for you and everyone else (for now)

Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 02:14:36 PM
WHAT?

Reverting back to Dropbox?

There has got to be a reason for this... Why is it so random? Why have some never seen it? I can download all day and never see it.

I have a capture (.cap) file; it is time to break it down.
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 02:36:23 PM
I have changed the location of the file to a Google Drive (https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing) source until this is figured out

Can you test getting it again, lwfitz (or anyone with issues), while I re-evaluate one.com?
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 02:38:38 PM
Now, having said all this and having done all this: it is Monday.

I have only had issues with one.com on Mondays... It is like a mass recovery is happening that affects everyone: time-outs, downloads, etc.
Title: Re: Unable to download
Post by: dizzie on April 01, 2013, 03:15:48 PM
I haven't had any issues downloading, not even with crap-o-matic (MSIE).

A Google Drive link would be nice for now. Since I have no issues downloading, I can't really test it; we need lwfitz and jedi for that (and others).

Title: Re: Unable to download
Post by: jefsview on April 01, 2013, 05:40:05 PM
Current download completed successfully.
Title: Re: Unable to download
Post by: dizzie on April 01, 2013, 05:43:57 PM
Just lwfitz's luck that sucks :)
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 05:45:32 PM
Current download completed successfully.

How fast? Typical of what you would normally get?
Title: Re: Unable to download
Post by: lwfitz on April 01, 2013, 05:48:45 PM
Thanks guys, downloading now
Title: Re: Unable to download
Post by: dizzie on April 01, 2013, 05:56:14 PM
@VastOne: somewhere between 30 and 80 KB/s; everywhere else I get 2 MB/s.
Title: Re: Unable to download
Post by: lwfitz on April 01, 2013, 06:01:00 PM
The download from Google Drive is WAY faster for me. All the previous downloads that failed said about 30 min to download, and on Google Drive it finished in under 6 min.
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 07:55:44 PM
^ I am also getting faster speeds with the Google Drive Download
Title: Re: Unable to download
Post by: Sector11 on April 01, 2013, 08:26:55 PM
I have always had problems with one.com, regardless of the day.

Clicking on the link and downloading in Iceweasel is a 100% failure rate for me.
And I'll add, at times just logging in is a pain for me (I know you've heard that before), but there it is again, and not just on Mondays.

@dizzie: wget -c = 'continue', and it may or may not be a good idea:
Code:
       -c
       --continue
           Continue getting a partially-downloaded file.  This is useful when
           you want to finish up a download started by a previous instance of
           Wget, or by another program.  For instance:

                wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z

           If there is a file named ls-lR.Z in the current directory, Wget
           will assume that it is the first portion of the remote file, and
           will ask the server to continue the retrieval from an offset equal
           to the length of the local file.

           Note that you don't need to specify this option if you just want
           the current invocation of Wget to retry downloading a file should
           the connection be lost midway through.  This is the default
           behavior.  -c only affects resumption of downloads started prior to
           this invocation of Wget, and whose local files are still sitting
           around.

           Without -c, the previous example would just download the remote
           file to ls-lR.Z.1, leaving the truncated ls-lR.Z file alone.

           Beginning with Wget 1.7, if you use -c on a non-empty file, and it
           turns out that the server does not support continued downloading,
           Wget will refuse to start the download from scratch, which would
           effectively ruin existing contents.  If you really want the
           download to start from scratch, remove the file.

           Also beginning with Wget 1.7, if you use -c on a file which is of
           equal size as the one on the server, Wget will refuse to download
           the file and print an explanatory message.  The same happens when
           the file is smaller on the server than locally (presumably because
           it was changed on the server since your last download
           attempt)---because "continuing" is not meaningful, no download
           occurs.

           On the other side of the coin, while using -c, any file that's
           bigger on the server than locally will be considered an incomplete
           download and only "(length(remote) - length(local))" bytes will be
           downloaded and tacked onto the end of the local file.  This
           behavior can be desirable in certain cases---for instance, you can
           use wget -c to download just the new portion that's been appended
           to a data collection or log file.

           However, if the file is bigger on the server because it's been
           changed, as opposed to just appended to, you'll end up with a
           garbled file.  Wget has no way of verifying that the local file is
           really a valid prefix of the remote file.  You need to be
           especially careful of this when using -c in conjunction with -r,
           since every file will be considered as an "incomplete download"
           candidate.

           Another instance where you'll get a garbled file if you try to use
           -c is if you have a lame HTTP proxy that inserts a "transfer
           interrupted" string into the local file.  In the future a
           "rollback" option may be added to deal with this case.

           Note that -c only works with FTP servers and with HTTP servers that
           support the "Range" header.

I have to rethink that one for my aliases; maybe:
Code:
       -t number
       --tries=number
           Set number of retries to number.  Specify 0 or inf for infinite
           retrying.  The default is to retry 20 times, with the exception of
           fatal errors like "connection refused" or "not found" (404), which
           are not retried.
would be better, but I've never needed more than 4 (once, I think); 3 a few times, and 2 very commonly.
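For what it's worth, a wrapper along these lines would combine resume with a bounded retry count. This is only a sketch of the idea; the name and the numbers are my own, and it is not necessarily what the wg2 alias actually expands to:

```shell
# Sketch only: combine resume (-c) with bounded retries.
# Name and counts are illustrative, not the real wg2 alias.
wg_retry() {
    # -c: resume a partial file
    # --tries=10: up to 10 attempts
    # --waitretry=5: wait up to 5 seconds between retries
    wget -c --tries=10 --waitretry=5 "$@"
}

# Usage: wg_retry http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
```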

At the moment Iceweasel is downloading it from Google Drive, and true to form it is using 100% of my bandwidth. I'm writing this in medit while it does that.

And again: Google sees my IP addy as Argentina. Why do I get FRENCH?
That's a rhetorical question; I'm not looking for or expecting an answer.
(http://t.imgbox.com/abmLzM6h.jpg) (http://imgbox.com/abmLzM6h)

Started at 16:14 local
16:19:?? - 120.5 MB - ±40min I'm guessing with a 3GB connection
16:28:11 - 300MB (I waited for it)
16:45:21 - Finished


(http://t.imgbox.com/abvbxdwB.jpg) (http://imgbox.com/abvbxdwB)
You can see a failed download at 229.7 MB; it is still there, as I was testing.

Going to try 'wg2' on the Google Drive link and see what that does.
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 08:31:51 PM
Any download from anywhere will use 100% bandwidth unless you use something like wget the way you have outlined.

To me, 5 minutes of 100% bandwidth usage is not an issue.

What is the issue is failure... All I need to know is that all downloads are completing at 100% from the Google Drive download.
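(If the 100% bandwidth usage ever were a concern, wget can throttle itself with --limit-rate. A sketch; the 300k cap and the wrapper name are my own picks, not anything from this thread:)

```shell
# --limit-rate caps wget's transfer speed so a big download
# doesn't saturate the link. Value and name are illustrative.
wget_capped() {
    wget --limit-rate=300k -c "$@"
}

# Usage: wget_capped http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
```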
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 08:33:19 PM
I have not heard any other reported issues with one.com since the very beginning

If anyone else has issues at anytime, please report them
Title: Re: Unable to download
Post by: Sector11 on April 01, 2013, 08:41:42 PM
  I know, I'm the only person... maybe it's the Argentinian connection. Who knows.

Also, I cannot get wget to work with Google Drive; attachments should open in your browser. (Now they are in Spanish; go figure.)

Code:
01 Apr 13 | 17:30:49 ~
         $ wg2 https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?pli=1
--2013-04-01 17:31:12--  https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?pli=1
Resolving docs.google.com (docs.google.com)... 173.194.42.0, 173.194.42.1, 173.194.42.5, ...
Connecting to docs.google.com (docs.google.com)|173.194.42.0|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘edit?pli=1’

    [ <=>                                                                                    ] 14,377      --.-K/s   in 0.07s   

2013-04-01 17:31:13 (213 KB/s) - ‘edit?pli=1’ saved [14377]

 
 01 Apr 13 | 17:31:13 ~
         $ wg2 https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
--2013-04-01 17:31:48--  https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
Resolving docs.google.com (docs.google.com)... 173.194.42.6, 173.194.42.7, 173.194.42.8, ...
Connecting to docs.google.com (docs.google.com)|173.194.42.6|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘edit?usp=sharing’

    [ <=>                                                                                    ] 14,380      --.-K/s   in 0.01s   

2013-04-01 17:31:49 (1.19 MB/s) - ‘edit?usp=sharing’ saved [14380]

 
 01 Apr 13 | 17:31:49 ~
         $ wget https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
--2013-04-01 17:32:16--  https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
Resolving docs.google.com (docs.google.com)... 173.194.42.8, 173.194.42.2, 173.194.42.0, ...
Connecting to docs.google.com (docs.google.com)|173.194.42.8|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘edit?usp=sharing.1’

    [ <=>                                                                                    ] 14,380      --.-K/s   in 0.05s   

2013-04-01 17:32:17 (260 KB/s) - ‘edit?usp=sharing.1’ saved [14380]

 
 01 Apr 13 | 17:32:17 ~
         $
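(Those 14 KB files saved as 'edit?pli=1' and 'edit?usp=sharing' are almost certainly the Drive viewer page saved as HTML, not the ISO; wget fetched the page at that URL, and the page normally loads the real file via the browser. A quick sanity check for a suspicious download; the function name is my own invention:)

```shell
# Heuristic check: did wget save a web page instead of the ISO?
# Looks for an HTML marker in the first 256 bytes of the file.
looks_like_html() {
    head -c 256 "$1" | grep -Eqi '<html|<!doctype'
}

# Usage: looks_like_html 'edit?usp=sharing' && echo "got a web page, not the ISO"
```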
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 08:49:25 PM
The file in the wget instructions is still the same; it is coming from vsido.org and not the Google Drive site.

Since that one works fine, I have left it the same
Title: Re: Unable to download
Post by: jefsview on April 01, 2013, 09:48:17 PM
Sorry, I left the computer for a few hours.

The download was faster, but I didn't time it. A direct download always takes longer than wget. The one from Google Drive completed in less time. I'm just used to waiting and waiting for a download to complete, but it was done before I realized it. And complete.

-- Jeff
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 09:55:12 PM
No worries jefsview...

Thanks for the test and the report back, it is appreciated!  8)
Title: Re: Unable to download
Post by: lwfitz on April 01, 2013, 10:15:30 PM
I am home today and have multiple networks from multiple ISPs... ;D ;D ... So I am now going to test wget and direct download on all networks. I'll update, and maybe we can get this figured out.


Ok... I found a fix for wget timing out and then terminating.

Maybe someone else can test this also

Code:
wget -c --tries=0 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso

The "--tries=0" option tells wget to retry the connection to the download infinitely...

So if you wanted it to retry 5 times, it would be
--tries=5

and the -c option tells it to continue with the partial download.


Edit:

Oops, it looks like Sector11 hit on this earlier, but I missed his post.
Title: Re: Unable to download
Post by: VastOne on April 01, 2013, 10:28:30 PM
The vsido.org link is

http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso (http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso)

Thanks for helping test lwfitz
Title: Re: Unable to download
Post by: lwfitz on April 01, 2013, 10:33:01 PM
^ Thanks. I'm testing now.


Ok, direct download from http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso (http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso) on my main network failed, but wget with

Code:
wget -c --tries=0 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
worked like a charm on my main network, which is a wireless 4G Clear (https://www.clear.com/) connection.


Both direct and wget downloads are running on my Verizon DSL network.


Ok, so both downloads on Verizon errored, but wget with the retry option continued on.

Code:
luke@G73JH-VSIDO:~$  wget -c --tries=0 http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
--2013-04-01 15:54:04--  http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
Resolving vsido.org (vsido.org)... 46.30.211.55
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 670040064 (639M) [application/x-iso9660-image]
Saving to: ‘vsido_v1-2_3.8-3_Kernel.iso’

18% [======>                                ] 122,444,003  230KB/s   in 10m 1s

2013-04-01 16:04:38 (199 KB/s) - Connection closed at byte 122444003. Retrying.

--2013-04-01 16:04:39--  (try: 2)  http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 670040064 (639M), 547596061 (522M) remaining [application/x-iso9660-image]
Saving to: ‘vsido_v1-2_3.8-3_Kernel.iso’

20% [+++++++                                ] 136,576,148  349KB/s  eta 25m 23s


Now going to try and test to see why it's erroring like that.
Title: Re: Unable to download
Post by: lwfitz on April 02, 2013, 12:02:49 AM
Very odd... now direct download from the original link works fine, but wget (without the retry option) still terminates,
and there aren't any dropped packets or errors at all. The connection just terminates.


Code:
Running for 27 mins, 51 secs, since 2013-04-01 23:31:24 UTC+0000.
Total 1,167,856,833 bytes, in 1,267,350 packets. (1,267,514 captured, 0 dropped)
Title: Re: Unable to download
Post by: VastOne on April 02, 2013, 12:21:19 AM
^ So strange, and I can never get any one of them to fail.
Title: Re: Unable to download
Post by: lwfitz on April 02, 2013, 03:01:58 AM
Here's how to wget from Google Drive:


First we need the actual URL for the ISO, which in this case happens to be

Code:
http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
Now

Code:
wget http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso

worked just fine for me, but from what I read some people had intermittent issues, so
this should take care of any erroring:

Code:
wget -c --tries=0 --no-check-certificate --content-disposition http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso


This will work until VastOne changes the name of the ISO, at which point you would just change the name of the file in the wget command.
Title: Re: Unable to download
Post by: VastOne on April 02, 2013, 03:05:23 AM
Not the case lwfitz...

Code:
http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
is the vsido.org URL location

The URL for the Google Drive ISO is

Code:
https://docs.google.com/file/d/0B4gKMu7RCW3eMjZKelNPVlE2S3M/edit?usp=sharing
Title: Re: Unable to download
Post by: VastOne on April 02, 2013, 03:07:53 AM
And everything I checked on the internet today to get wget to work with Google Drive is a NO CAN DO.

And that is all right, because the file is wget'able from vsido.org, so it is a non-issue.

But I will add your instructions to the How To/Download page for wget, if there is a How To done for it

Thanks lwfitz!
Title: Re: Unable to download
Post by: lwfitz on April 02, 2013, 03:09:34 AM
WOW... that's a big oops on my part... sorry about that, you're 100% correct. I thought that was too easy...


Total brain meltdown, I guess. I looked at the download link and thought it looked familiar, but it didn't even dawn on me. Oh well, I'm gonna keep at this tonight and see if I can work any magic.