PDA

View Full Version : Slow Downloads. 6 Mbit. Should take 2 hours



binarydepth
22-Jun-2014, 04:46
What can be done?
The download starts fine but it loses speed.

I'm asking my friends to help me download the ISO :s

TY for your attention. BD


--
binarydepth
------------------------------------------------------------------------
binarydepth's Profile: http://forums.opensuse.org/member.php?userid=80573
View this thread: http://forums.opensuse.org/showthread.php?t=498991

malcolmlewis
22-Jun-2014, 04:46
Hi
Find a mirror close to your location from the following URL and use
that.
http://mirrors.opensuse.org/list/13.1.html


--
Cheers Malcolm °¿° SUSE Knowledge Partner (Linux Counter #276890)
openSUSE 13.1 (Bottle) (x86_64) GNOME 3.10.1
If you find this post helpful and are logged into the web interface,
please show your appreciation and click on the star below... Thanks!
------------------------------------------------------------------------
malcolmlewis's Profile: http://forums.opensuse.org/member.php?userid=740
View this thread: http://forums.opensuse.org/showthread.php?t=498991

malcolmlewis
22-Jun-2014, 05:06
Hi
Sorry, thought you were after the openSUSE dvd. So it's an image you
created on SUSE Studio? How big is the file?



binarydepth
22-Jun-2014, 05:26
malcolmlewis;2650123 Wrote:
> Hi
> Sorry, thought you were after the openSUSE dvd. So it's an image you
> created on SUSE Studio? How big is the file?

2 GB. I really need an ISO with everything :/



malcolmlewis
22-Jun-2014, 05:46
binarydepth;2650125 Wrote:
> 2 GB I really need a ISO with everything :/
Hi
Try wget rather than a browser, e.g.:

Code:
--------------------

wget -c https://susestudio.com/download/<lots_of_numbers_and_characters>/<your_image>

--------------------

Else I guess their site is just running slow...



binarydepth
26-Jun-2014, 20:16
malcolmlewis;2650126 Wrote:
> Hi
> Try wget rather than a browser eg;
>
> Code:
> --------------------
> wget -c https://susestudio.com/download/<lots_of_numbers_and_characters>/<your_image>
> --------------------
>
> Else I guess their site is just running slow...

Facepalm... Thanks :) hahhahaa



binarydepth
26-Jun-2014, 20:26
Code:
--------------------
wget -r 1000 -T 300 https://susestudio.com/download/...
--------------------


I suppose this results in 1000 tries with a 5 min interval; if it fails
more than that, then it's impossible.



hendersj
26-Jun-2014, 23:50
On Thu, 26 Jun 2014 19:26:02 +0000, binarydepth wrote:

> Code:
> --------------------
> wget -r 1000 -T 300 https://susestudio.com/download/...
> --------------------
>
>
> I suppose this results in 1000 tries with a 5 min interval; if it fails
> more than that, then it's impossible.

FWIW, downloads here are not too bad; when I'm having trouble with a
larger download, I try using aria2 instead and do a parallel download
(aria2 does a good job of segmenting a larger download and retrieving it
in multiple parts, assembling it into the original as it goes -
regardless of protocol, generally).

You might give that a try. Something like:

aria2c --max-connection-per-server=4 --min-split-size=1M [url]

That should cause it to do 4 simultaneous downloads, and download the
file in 1 MB chunks.

There are other options that may help as well, and it can also be used to
restart an aborted download (see '-c' in the help for details) so you
don't have to start over every time the download fails.
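The segment-and-assemble behaviour described above can be illustrated with a toy sketch, using in-memory bytes to stand in for HTTP range requests (the names here are made up for the sketch; this is not aria2's actual code):

```python
# Toy illustration of segmented downloading: fetch byte ranges
# "in parallel" and reassemble them in their original order, roughly
# as aria2 does with HTTP Range requests against a real server.
from concurrent.futures import ThreadPoolExecutor

FILE = bytes(range(256)) * 1000  # stand-in for the remote file

def fetch_range(start, end):
    """Pretend range request: return bytes [start, end) of the file."""
    return FILE[start:end]

def segmented_download(size, chunk):
    # Split [0, size) into chunk-sized ranges, fetch them concurrently,
    # then join the parts; map() yields results in submission order.
    ranges = [(i, min(i + chunk, size)) for i in range(0, size, chunk)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        parts = pool.map(lambda r: fetch_range(*r), ranges)
    return b"".join(parts)

assert segmented_download(len(FILE), 4096) == FILE
```

In the real client, each range would be a separate connection to the server, which is why the server's per-client connection limit matters.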

Jim

--
Jim Henderson, CNA6, CDE, CNI, LPIC-1, CLA10, CLP10
Novell/SUSE/NetIQ Knowledge Partner

binarydepth
27-Jun-2014, 12:56
Jim Henderson;2650947 Wrote:
> On Thu, 26 Jun 2014 19:26:02 +0000, binarydepth wrote:
>
> > Code:
> > --------------------
> > wget -r 1000 -T 300 https://susestudio.com/download/...
> > --------------------
> >
> > I suppose this results in 1000 tries with a 5 min interval, if it
> > fails more than that then it's impossible.
>
> FWIW, downloads here are not too bad; when I'm having trouble with a
> larger download, I try using aria2 instead and do a parallel download
> (aria2 does a good job of segmenting a larger download and retrieving
> it in multiple parts, assembling it into the original as it goes -
> regardless of protocol, generally).
>
> You might give that a try. Something like:
>
> aria2c --max-connection-per-server=4 --min-split-size=1M [url]
>
> That should cause it to do 4 simultaneous downloads, and download the
> file in 1 MB chunks.
>
> There are other options that may help as well, and it can also be
> used to restart an aborted download (see '-c' in the help for
> details) so you don't have to start over every time the download
> fails.
>
> Jim

I changed the command to:
Code:
--------------------
wget -c -T 150 --tries=1000 <URL>
--------------------


That's a great program. It's the way to go when downloading large files.
:D

Thanks!



binarydepth
27-Jun-2014, 13:06
It's done now. I didn't take note of the "try" number :p, sorry if you
were curious. I'll be testing tomorrow at the latest.
BD



binarydepth
27-Jun-2014, 21:06
I used 12 connections with 10 MB chunks while racing it against wget,
and aria2 won. :P



Code:
--------------------
aria2c --max-connection-per-server=12 --min-split-size=10M <URL>
--------------------


(1024*2)/(16*4) = 32 MB per split.

In general: (1024*A)/(4*B), where A = file size in GB and B = maximum
number of connections allowed.

What do you think of that model?


Code:
--------------------
aria2c --max-connection-per-server=16 --min-split-size=32M <URL>
--------------------
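The model above can be sanity-checked with a short script. This is only a sketch of the proposed rule (the helper name is made up, and aria2 itself computes nothing like this):

```python
# Sketch of the split-size model proposed above; split_size_mb is a
# made-up helper for checking the arithmetic, not part of aria2.
def split_size_mb(size_gb, max_connections):
    """(1024*A)/(4*B): a --min-split-size value in MB for an A GB file
    downloaded over at most B connections."""
    return (1024 * size_gb) / (4 * max_connections)

# The worked example: a 2 GB file with 16 connections -> 32 MB per split.
print(split_size_mb(2, 16))  # 32.0
```

Rounded to a whole number of megabytes, the result can be passed straight to --min-split-size.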


Cheers :)



hendersj
28-Jun-2014, 03:00
On Fri, 27 Jun 2014 20:06:01 +0000, binarydepth wrote:

> What do you think of that model ?

Ultimately, I think if the model maxes out your connection speed, it's a
good model. :)

Jim




binarydepth
28-Jun-2014, 04:56
Jim Henderson;2651105 Wrote:
> On Fri, 27 Jun 2014 20:06:01 +0000, binarydepth wrote:
>
> > What do you think of that model ?
>
> Ultimately, I think if the model maxes out your connection speed,
> it's a good model. :)
>
> Jim

aria2 should tell you that, if it doesn't already. Don't you think?

Of course, many CLI users have enough common sense. I'm biased towards
precision, but aria2 would stop if the download had already finished.



hendersj
28-Jun-2014, 06:36
On Sat, 28 Jun 2014 03:56:02 +0000, binarydepth wrote:

> Aria should tell that if it doesn't. Don't you think ?

It's difficult to judge how much bandwidth is available. TCP doesn't
work like that - it has some built-in dynamic throttling based on
whether a request that was sent receives a response, but there's no
built-in way for the software to ask "how fast is my connection?".
That's why (for example) torrent speed throttling by the client is
inexact; it operates by withholding responses to the inbound data, so
the sender will throttle back on how fast the data is being sent in
order to cut down on retransmissions.

It also depends on how many concurrent connections the server is
configured to permit overall and per client. You might want to open 12
connections to the server, but if the server only permits 4 per client,
then you're not going to get an optimal speed over 12 connections.
Similarly, if the server is configured to limit the amount of outbound
data being sent to an individual connection or to a specific client,
that's also a factor.

As are the network links between you and the server. I guarantee you
that if you have a 30 Mbps connection (as I do) and you connect to a
server that's got 10 Mbps or 100 Mbps worth of bandwidth available to it,
but there's a high-latency 56 Kbps link between you and the server,
you're not going to max out your connection. ;)

The same is true even if you have a full 30 Mbps between you and the
server, unless you have a dedicated connection to that server, because
other people are using that bandwidth as well.

It's not such a simple problem to solve, because networks aren't simply
constructed. :)

> Of course many CLI users have enough common sense. I'm biased to
> precision but Aria would stop if the download already finished.

Naturally it would stop if the download was done - there would be no more
data to send. ;)

Jim
