I have built, installed, tested, and uploaded the latest ISO to the host.
The Debian ISO is at the latest Sid level with the latest kernel.
If anyone is having download issues, Sector11 has tested fetching the ISO with wget after a failed download.
To use wget, just cd in a terminal to where you want the file and run:
Debian
wget http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso
I have also added a link to the download page to assist with wget downloads
Thank you Sector11! 8) 8)
The above 'wget' command puts the ISO into the directory you are currently in and uses all your bandwidth.
I don't like it when downloading a file uses ALL my bandwidth and I have to sit around twiddling my thumbs. These are multitasking machines so I decided to start multitasking online while "wget" is "web-getting" a file.
I have a 3 Mb connection and my max download speed is roughly 380 KB/s.
So this is what I did, I created a couple of bash aliases:
## limit wget to 200K
alias wg2='wget --limit-rate=200k'
## limit wget to 100K
alias wg1='wget --limit-rate=100k'
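To make these stick across sessions, the two aliases can be appended to ~/.bashrc (a sketch; wg1 and wg2 are just the names from this post, pick your own if you like):

```shell
# Append the rate-limiting aliases to ~/.bashrc so every new
# terminal picks them up (open a new shell or `source ~/.bashrc`)
cat >> ~/.bashrc <<'EOF'
## limit wget to 200K
alias wg2='wget --limit-rate=200k'
## limit wget to 100K
alias wg1='wget --limit-rate=100k'
EOF
```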
Let's use the latest VSIDO ISO md5sum file as an example:
1. I open SpaceFM and cruise over to: /media/5/VSIDO_ISO
- > right click on the folder window
- > New
- > Folder
- > type in: 2013-02-04
- > Enter
- > Enter and the bar at the top now reads:
/media/5/VSIDO_ISO/2013-02-04
2. Copy that (no pasting yet, jedi; see step 3)
3. Open a terminal and type 'cd' and paste that line in there, hit enter:
sector11 @ sector11
04 Feb 13 | 12:49:58 ~
$ cd /media/5/VSIDO_ISO/2013-02-04
sector11 @ sector11
04 Feb 13 | 12:50:12 /media/5/VSIDO_ISO/2013-02-04
$
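For anyone who would rather skip SpaceFM, steps 1 to 3 can be done in one shot from the terminal (a sketch; $HOME stands in for the /media/5 path above, point DEST wherever you have write access):

```shell
# Create the dated download folder and move into it
DEST="$HOME/VSIDO_ISO/2013-02-04"
mkdir -p "$DEST"   # -p creates any missing parent folders
cd "$DEST"
pwd                # confirm we landed in the right place
```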
4. Now on the VSIDO download page, right click on the link you want and select:
- Copy Link Location.
5. Back in the terminal, type wg1 (I clicked the md5sum link), paste in the link, and press Enter:
sector11 @ sector11
04 Feb 13 | 13:22:23 ~
$ cd /media/5/VSIDO_ISO/2013-02-04
sector11 @ sector11
04 Feb 13 | 13:22:34 /media/5/VSIDO_ISO/2013-02-04
$ wg1 http://vsido.org/debian/vsido_v1-2_3.7-1_Kernel.iso.md5
--2013-02-04 13:22:50-- http://vsido.org/debian/vsido_v1-2_3.7-1_Kernel.iso.md5
Resolving vsido.org (vsido.org)... 46.30.211.55
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
--2013-02-04 13:23:01-- (try: 2) http://vsido.org/debian/vsido_v1-2_3.7-1_Kernel.iso.md5
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
--2013-02-04 13:23:11-- (try: 3) http://vsido.org/debian/vsido_v1-2_3.7-1_Kernel.iso.md5
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
--2013-02-04 13:23:21-- (try: 4) http://vsido.org/debian/vsido_v1-2_3.7-1_Kernel.iso.md5
Connecting to vsido.org (vsido.org)|46.30.211.55|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 62 [application/x-iso9660-image]
Saving to: 'vsido_v1-2_3.7-1_Kernel.iso.md5'
100%[======================================>] 62 --.-K/s in 0s
2013-02-04 13:23:27 (7.21 MB/s) - 'vsido_v1-2_3.7-1_Kernel.iso.md5' saved [62/62]
sector11 @ sector11
04 Feb 13 | 13:23:27 /media/5/VSIDO_ISO/2013-02-04
$
And there it is:
/media/5/VSIDO_ISO/2013-02-04/vsido_v1-2_3.7-1_Kernel.iso.md5
That reads:
9626c4fa5914b06f8565d24365f6fc25 vsido_v1-2_3.7-1_Kernel.iso
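Once the ISO itself has finished downloading, that .md5 file is what lets you verify it with md5sum -c. A minimal round-trip sketch with a stand-in file (substitute the real ISO and .md5 names when checking an actual download):

```shell
# Make a stand-in "ISO" and its checksum file, then verify it,
# exactly as you would with vsido_v1-2_3.7-1_Kernel.iso
echo "demo contents" > demo.iso
md5sum demo.iso > demo.iso.md5
md5sum -c demo.iso.md5   # prints "demo.iso: OK" on a match
```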
By the way:
- I did this while another terminal was using wg2 to get the ISO, and
- I collected email twice
^ That should be a How To
Nice...
I can move it if you want - it will give me practice.
But since you are pointing here for wget, it makes sense here.
Use wget with the -c option; from the manpage:
-c, --continue resume getting a partially-downloaded file.
very handy for large files like iso images in case your bandwidth gets interrupted.
wget -c <the file being downloaded>
I think this should be made into a How To, and when that is complete I will change the link to point to that How To.
I do believe there is more information that should be included.
Thanks
Quote from: PackRat on March 11, 2013, 12:24:41 PM
Use wget with the -c option; from the manpage:
-c, --continue resume getting a partially-downloaded file.
very handy for large files like iso images in case your bandwidth gets interrupted.
wget -c <the file being downloaded>
Do you think that telling users
Quote: If you have issues downloading, please use
wget http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso -c
is enough on the download page rather than a How To?
I am thinking it is
-c will work if wget is "stopped" for some reason by you, for example to do other things. But it requires a bit of understanding as well.
The "continue" function of wget is the default and is not necessary if you are going to start wget and let it run its course:
Quote: Note that you don't need to specify this option if you just want
the current invocation of Wget to retry downloading a file should
the connection be lost midway through. This is the default
behavior. -c only affects resumption of downloads started prior to
this invocation of Wget, and whose local files are still sitting
around.
The upsides of "-c"
Quote: On the other side of the coin, while using -c, any file that's
bigger on the server than locally will be considered an incomplete
download and only "(length(remote) - length(local))" bytes will be
downloaded and tacked onto the end of the local file. This
behavior can be desirable in certain cases---for instance, you can
use wget -c to download just the new portion that's been appended
to a data collection or log file.
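That "(length(remote) - length(local))" arithmetic can be simulated locally without touching the network (a sketch; remote.bin plays the server-side file and local.bin the interrupted download, dd stands in for the ranged request wget -c would make):

```shell
# Full 8-byte "remote" file vs. a 4-byte partial "download"
printf 'AAAABBBB' > remote.bin
printf 'AAAA'     > local.bin
# wget -c would request bytes 4-7 and tack them on; emulate with dd
dd if=remote.bin of=local.bin bs=1 skip=4 seek=4 conv=notrunc status=none
cmp -s remote.bin local.bin && echo "resumed OK"
```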
The downsides of "-c"
Quote: Beginning with Wget 1.7, if you use -c on a non-empty file, and it
turns out that the server does not support continued downloading,
Wget will refuse to start the download from scratch, which would
effectively ruin existing contents. If you really want the
download to start from scratch, remove the file.
Also beginning with Wget 1.7, if you use -c on a file which is of
equal size as the one on the server, Wget will refuse to download
the file and print an explanatory message. The same happens when
the file is smaller on the server than locally (presumably because
it was changed on the server since your last download
attempt)---because "continuing" is not meaningful, no download
occurs.
However, if the file is bigger on the server because it's been
changed, as opposed to just appended to, you'll end up with a
garbled file. Wget has no way of verifying that the local file is
really a valid prefix of the remote file. You need to be
especially careful of this when using -c in conjunction with -r,
since every file will be considered as an "incomplete download"
candidate.
Another instance where you'll get a garbled file if you try to use
-c is if you have a lame HTTP proxy that inserts a "transfer
interrupted" string into the local file. In the future a
"rollback" option may be added to deal with this case.
Note that -c only works with FTP servers and with HTTP servers that
support the "Range" header.
Questions:
- Does one.com insert a "transfer interrupted" string into a local file?
- Does one.com support the "Range" header?
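One way to answer the Range question directly is to ask the server for a single byte and look at the status code: 206 Partial Content means ranged (resumable) downloads work, while a plain 200 means the server ignored the Range header. A sketch assuming curl is installed, using the ISO URL from the download page; it needs a live connection and reports 000 when offline:

```shell
# Probe Range support: request only the first byte and print the code
status=$(curl -s -m 10 -o /dev/null -w '%{http_code}' \
  -H "Range: bytes=0-0" \
  http://vsido.org/debian/vsido_v1-2_3.8-3_Kernel.iso || true)
echo "HTTP status: $status"   # 206 = Range supported, resume will work
```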
^ Why wouldn't they?
If wget works from the host, that answers these questions
I have no idea; I'm no expert. I saw what I read and asked questions.
If it works from the host as you say, then I'll need to know the answers to the questions from this end.
Oh come on... you're still on the fkn wget issue? Seriously, it's not all of us having issues downloading from one.com, so you can't blame the host (as you so nicely pointed out up there).
Leave the issue! Move on, go grab a cookie and a glass of milk! 8)
All I really want is a nice How To for wget and I can move on... :)
1. wget file
2. wait
3. profit
or
1. wget -c file
2. wait
3. profit
You're welcome ;D
Topic locked, move on kids, nothing to see here