Gentoo Forums
HOWTO:Download Cache for your LAN-Http-Replicator (ver 3.0)

Reply to topic    Gentoo Forums Forum Index Documentation, Tips & Tricks
Author Message
Gherald2
Guru


Joined: 02 Jul 2003
Posts: 326
Location: Madison, WI USA

PostPosted: Wed Aug 11, 2004 4:49 pm    Post subject: Reply with quote

Like I said, strange things happen even with the correct resume command. I have had such problems from time to time; the solution is to delete the file from the cache and start over.

The safest approach for a long emerge is to fork an emerge -f (fetch-only) process:

Open a second console and add -f to your regular emerge command. Once the first file is downloaded, go back to the first terminal and run your emerge without -f.

The downloads will complete sooner, so you can make sure everything is fetched OK for whatever long build you are doing.
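In practice, the two terminals look something like this (the package name is just an example):

```shell
# Terminal 1: fetch-only pass pulls every distfile through the
# http-replicator cache ahead of the build
emerge --fetchonly kde-base/kde

# Terminal 2: once the first distfile has landed, start the real
# build; it reuses what --fetchonly already downloaded
emerge kde-base/kde
```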
_________________
Unregistered Linux User #17598363
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Wed Aug 11, 2004 9:01 pm    Post subject: Reply with quote

carpman wrote:

This is the relevent part of make.conf from client:

Code:

# GENTOO_MIRRORS="http://www.mirror.ac.uk/sites/www.ibiblio.org/gentoo http://gentoo.oregonstate.edu http://www.ibiblio.org/pub/Linux/distributions/gentoo"



I think I can spot the primary source of the errors here. You are using the default mirrors!!!

During the development of http-replicator I had many weird errors, and I traced them back to the default mirrors. They are so overloaded that frequent timeouts and random network errors are common. I had particular trouble with ibiblio, so I removed all ibiblio mirrors from my system.

Since I changed my mirrors those problems went away!!


I have two recommendations:

1. Use other mirrors by uncommenting the GENTOO_MIRRORS line in your make.conf.

2. Remove any ibiblio mirrors.

Run emerge mirrorselect if you need help finding fast mirrors.
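A typical invocation would look roughly like this (the -i/-o flags are from mirrorselect's own help output; double-check with mirrorselect --help on your version):

```shell
# one-time install of the tool
emerge mirrorselect

# pick mirrors interactively and append the GENTOO_MIRRORS line
mirrorselect -i -o >> /etc/make.conf
```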
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Wed Aug 11, 2004 9:14 pm    Post subject: Reply with quote

Gherald wrote:
Like I said, strange things happen even with the correct resume command. I have had such problems from time to time, the solution is to delete the file from the cache and start over.



Which mirrors are you using?
carpman
Advocate


Joined: 20 Jun 2002
Posts: 2202
Location: London - UK

PostPosted: Wed Aug 11, 2004 10:31 pm    Post subject: Reply with quote

flybynite, thanks for the info, I will try the mirror suggestion.

However, this still does not answer my question as to why http-replicator was serving an incomplete file from the cache and reporting it as complete.

Surely there should be a safeguard so this does not happen!
_________________
Work Station - 64bit
Gigabyte GA X48-DQ6 Core2duo E8400
8GB GSkill DDR2-1066
SATA Areca 1210 Raid
BFG OC2 8800 GTS 640mb
--------------------------------
Notebook
Samsung Q45 7100 4gb
Gherald2
Guru


Joined: 02 Jul 2003
Posts: 326
Location: Madison, WI USA

PostPosted: Wed Aug 11, 2004 10:47 pm    Post subject: Reply with quote

flybynite wrote:
Gherald wrote:
Like I said, strange things happen even with the correct resume command. I have had such problems from time to time, the solution is to delete the file from the cache and start over.
Which mirrors are you using?


tds.net, gentoo.chem.wisc.edu, and a Michigan LUG

all very reliable AFAICT, especially tds.net which I use for most things.

Do you suppose it would be possible to get http-replicator to automatically delete a failed file and retry?
_________________
Unregistered Linux User #17598363
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Thu Aug 12, 2004 2:17 am    Post subject: Reply with quote

carpman wrote:
However this still does not answer my question as to why http-replicator was downloading an incomplete file from the cache and reporting it as being complete?

Surely there should be a safe guard so this does not happen!



There is room for improvement of course, but replicator does have many safeguards to prevent this from happening. I believe it takes a double error of a broken response from a server AND a SIMULTANEOUS network error to make this happen!! This is only in theory and I have never been able to reproduce this type of error.


The most difficult question is: how is replicator to know the file is incomplete?

http-replicator works like an HTTP server, similar to Apache. Does Apache check that the page you're viewing is complete? No, it can't; it just blindly retrieves HTML pages from a directory and sends them to your browser. Do the public mirrors MD5 the packages in their directories? No, they can't; most aren't even Gentoo boxes. They depend on the upstream server only sending good files and on standard network error checking.

replicator could check every file against a mirror before serving that file to clients. But that would mean that if the net was down, replicator wouldn't serve a file that is already in its cache, so replicator couldn't be used to help install a Gentoo system without internet access like it can now. It would also mean that you couldn't serve custom packages to systems on your LAN. Should replicator refuse to serve those packages because they don't exist on a public mirror?

I could have replicator use some portage features to check the MD5, but then replicator wouldn't run on a non-Gentoo box. Many Gentooers have to deal with a university's or employer's choice of servers. Should replicator be changed to only work with Gentoo? Right now it can easily run on a Debian/Fedora/whatever server at a university.



Now, having said that, I do have plans to combat possible errors. repcacheman will be expanded to check replicator's cache dir. This way Gentooers can take advantage of portage and its MD5s, while replicator itself stays very portable. Then another audit of the code in replicator can trap even more possible errors.

Remember that I have tested everything I can think of to make replicator NOT put an incomplete file in the cache. I challenge everyone to make replicator do so. I've killed processes, mangled routing, pulled network cables, unplugged power from routers, rebooted during download, etc, etc, and not once has replicator ever put an incomplete file in the cache.

repcacheman is an extra script, run on Gentoo systems, that does check MD5s. I will change the script to automatically check the cache, but you can check your files now.

If you want to check the files in replicator's cache now, just:

Code:

# move the cached files where repcacheman can verify them
mv /var/cache/http-replicator/*  /usr/portage/distfiles/
# verify against portage's MD5s; good files go back to the cache
/usr/bin/repcacheman


repcacheman will import 15,000+ MD5s from portage and check every file in /usr/portage/distfiles. The files that pass the MD5 check will be moved back to replicator's cache.
Gherald2
Guru


Joined: 02 Jul 2003
Posts: 326
Location: Madison, WI USA

PostPosted: Thu Aug 12, 2004 2:42 am    Post subject: Reply with quote

flybynite wrote:
The most difficult question is how is replicator to know the file is incomplete?
Well, why not something crude like filesize? MD5s are all well and good, but let portage take care of the detail work on its own.

It'd be nice if http-replicator just paid attention to obvious things like file size (e.g. a 3 MB partial when it should be a full 8 MB download) and thus knew when to restart downloads automatically.

But even that's more complicated than it needs to be; for a start, it'd be nicer if we could configure portage to bypass the http-replicator proxy completely whenever a download fails or an MD5 doesn't match.
_________________
Unregistered Linux User #17598363
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Thu Aug 12, 2004 3:15 am    Post subject: Reply with quote

Gherald wrote:
Well why not something crude like filesize?


I guess my post wasn't clear. replicator ALREADY uses filesize!! That's why I said the only way replicator could possibly save an incomplete file is if the MIRROR doesn't send the size, or sends the wrong size, AND there is a network error!

This problem isn't replicator specific! Note this option from the man page of wget as proof of broken servers:
Code:

--ignore-length
           Unfortunately, some HTTP servers (CGI programs, to be more precise) send out bogus "Content-Length" headers, which makes Wget go wild, as it thinks not all the document was retrieved. You can spot this syndrome if Wget retries getting the same document again and again, each time claiming that the (otherwise normal) connection has closed on the very same byte.
           With this option, Wget will ignore the "Content-Length" header---as if it never existed.


Gherald wrote:
for a start it'd be nicer if we could configure portage to bypass the http-replicator proxy completely whenever a download fails or an MD5 doesn't match.



replicator would work better with some help from portage :-) If the ebuild ever gets accepted I hope to see some help from portage.
Gherald2
Guru


Joined: 02 Jul 2003
Posts: 326
Location: Madison, WI USA

PostPosted: Fri Aug 13, 2004 11:55 pm    Post subject: Reply with quote

Someone with the right know-how could probably hack up a quick wrapper to emerge that would execute
Code:
http_proxy="" <previously failed emerge command>

Heck, if you wanted to get fancy it'd be feasible to automagically fork an "emerge --fetchonly" process to the background and only restart *IT* upon a failed DL. Just a convenience, of course, especially for server admins who wish to minimize downtime (or for that matter, maintenance time).
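A minimal sketch of such a wrapper (retry_emerge is a hypothetical helper name; arguments pass straight through to emerge):

```shell
# retry_emerge: run emerge through the proxy as usual; if it fails,
# retry once with http_proxy unset so a bad cached file can't block
# the build.
retry_emerge() {
    if ! emerge "$@"; then
        echo "emerge failed; retrying with http_proxy bypassed" >&2
        http_proxy="" emerge "$@"
    fi
}
```

You would then run e.g. `retry_emerge -uD world` instead of the plain emerge command.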
_________________
Unregistered Linux User #17598363
carpman
Advocate


Joined: 20 Jun 2002
Posts: 2202
Location: London - UK

PostPosted: Thu Aug 19, 2004 1:19 pm    Post subject: Reply with quote

Hello, I now have http-replicator set up on the net and it is working fine, except for one small issue.

Before using H-R I used this script to clean old ebuild versions from /usr/portage/distfiles

The problem now is that even after editing the paths in the script I can't get it to work on /var/cache/http-replicator

Any way I can get this to work, or get the feature built into H-R?

cheers
_________________
Work Station - 64bit
Gigabyte GA X48-DQ6 Core2duo E8400
8GB GSkill DDR2-1066
SATA Areca 1210 Raid
BFG OC2 8800 GTS 640mb
--------------------------------
Notebook
Samsung Q45 7100 4gb
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Thu Aug 19, 2004 5:37 pm    Post subject: Reply with quote

carpman wrote:
Hello, I now have http-replicator set up on the net and it is working fine, except for one small issue.

Before using H-R I used this script to clean old ebuild versions from /usr/portage/distfiles

The problem now is that even after editing the paths in the script I can't get it to work on /var/cache/http-replicator

Any way I can get this to work, or get the feature built into H-R?

cheers


That script works for me... I don't see any reason why it shouldn't work for you... What kind of error do you get?

There are many different types of cache-cleaning scripts; they all should work with http-replicator.
carpman
Advocate


Joined: 20 Jun 2002
Posts: 2202
Location: London - UK

PostPosted: Thu Aug 19, 2004 7:27 pm    Post subject: Reply with quote

flybynite wrote:


That script works for me... I don't see any reason why it shouldn't work for you... What kind of error do you get?

There are many different types of cache cleaning scripts, they all should work with http-replicator.....


There were no errors; it just did not find the old ebuilds that I knew were there.

I will double check paths and try again.
_________________
Work Station - 64bit
Gigabyte GA X48-DQ6 Core2duo E8400
8GB GSkill DDR2-1066
SATA Areca 1210 Raid
BFG OC2 8800 GTS 640mb
--------------------------------
Notebook
Samsung Q45 7100 4gb
meowsqueak
Veteran


Joined: 26 Aug 2003
Posts: 1549
Location: New Zealand

PostPosted: Sat Aug 28, 2004 11:14 pm    Post subject: Reply with quote

Would this program work properly as-is on a non-Gentoo host? I'm thinking of using my Debian server as the http-replicator proxy but I'm not sure how coupled it is to emerge/portage.
rpcyan
n00b


Joined: 29 Aug 2004
Posts: 31
Location: Massachusetts, USA

PostPosted: Sun Aug 29, 2004 5:27 pm    Post subject: dns names instead of IP Reply with quote

I've been using your http-replicator for a week now and so far I have been very impressed. I currently have 6 Gentoo boxes including the server, and I haven't done an emerge world -uDp in a while, so I have built up a decent cache and have been getting great speeds.

However, I am working in a college environment and I don't want anybody taking advantage of the open http proxy on the server. Somebody could theoretically use it to do "bad things" and implicate me in the process.

We have DHCP on campus that I have no control over, and I can't guarantee the IP of my client machines for any amount of time. Is it possible to make a whitelist of authorized clients based on DNS instead of IP?

Alternatively, is there any way to limit http-replicator to the gentoo mirrors in /etc/make.conf? I just set my proxy settings appropriately and I was able to pull up google.com, which is obviously not necessary for gentoo caching.

Again, great piece of software. Glad to do my part in minimizing the bandwidth loads on the public mirrors.
yahewitt
n00b


Joined: 20 Oct 2003
Posts: 12

PostPosted: Mon Aug 30, 2004 12:44 am    Post subject: Reply with quote

OK -- I "sort of" have this running. I installed the http-replicator on the server & configured the *.conf details as in the Howto on the "server" & make.conf on my one "client".

When emerging something on the client that has already been emerged on the server I get the package via the LAN -- but when I emerge on the client, a copy does not seem to be kept at the server machine, so if I then emerge this same thing on the server, it downloads again. I can't see where I've screwed up the config (assuming this isn't supposed to happen!) -- can anyone suggest where to look in terms of debug information on what the proxy does with the copy on the server machine?

thanks!


[*doh*] Never mind, I screwed up the permissions on the cache directory! It's working fine now; a neat bit of work.


Last edited by yahewitt on Mon Aug 30, 2004 12:53 am; edited 1 time in total
rpcyan
n00b


Joined: 29 Aug 2004
Posts: 31
Location: Massachusetts, USA

PostPosted: Mon Aug 30, 2004 12:49 am    Post subject: Reply with quote

Well, the best way to diagnose your problem is for you to post your http-replicator.conf.

I assume you have run repcacheman?
ponion
n00b


Joined: 12 Mar 2004
Posts: 8
Location: Essex UK

PostPosted: Mon Aug 30, 2004 9:11 am    Post subject: Reply with quote

I'm about to try and install Gentoo 2004.2 on a second machine.

I'd like to try using http-replicator on my first machine to save the second machine from getting everything from the internet.

I've read through this thread, and it is unclear what the latest version is that I should be using.

The first post in the thread says
Quote:

"To install on the server:
1. Download and install the ebuild in the portage overlay directory (/usr/local/portage by default), then emerge http-replicator.
Download
http://www.updatedlinux.com/replicator/http-replicator-flybynite-1.3.tar.bz2


But at the top of the page it says
Quote:

*** Unstable version including external proxy support - https://forums.gentoo.org/viewtopic.php?t=173226&start=76

Which points at 1.5?

So what is the latest stable version?

Peter.
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Tue Aug 31, 2004 4:18 pm    Post subject: Reply with quote

meowsqueak wrote:
Would this program work properly as-is on a non-Gentoo host? I'm thinking of using my Debian server as the http-replicator proxy but I'm not sure how coupled it is to emerge/portage.


Yes!!! Although I don't have a non-Gentoo machine to test on :-) I took care NOT to make the base cache Gentoo-specific for just this reason. Only my helper script repcacheman is Gentoo-specific; it wouldn't work with Debian, but it isn't required on Debian either! Without repcacheman, you must manually create the cache dir and set permissions when installing. The other functions in repcacheman aren't needed on Debian. You just need some kind of cache cleaner for when disk space becomes a problem; there is a script around the board somewhere that deletes files over X days old.
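A minimal cleaner of that sort might look like this (clean_cache is a hypothetical name; the 90-day cutoff and the default cache path are just examples to tune to your disk space):

```shell
# clean_cache: delete files older than $2 days (default 90) from the
# cache dir $1 (default /var/cache/http-replicator).
clean_cache() {
    dir=${1:-/var/cache/http-replicator}
    days=${2:-90}
    # -mtime +N matches files last modified more than N days ago
    find "$dir" -type f -mtime "+$days" -exec rm -f {} \;
}
```

Run it from cron on any distro; nothing in it depends on portage.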


Last edited by flybynite on Tue Aug 31, 2004 5:58 pm; edited 1 time in total
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Tue Aug 31, 2004 5:36 pm    Post subject: Re: dns names instead of IP Reply with quote

rpcyan wrote:
I've been using your http-replicator for a week now and so far I have been very impressed. (snip) I have built up a descent cache and have been getting great speeds.


Thanks!

rpcyan wrote:

However, I am working in a college environment and I don't want anybody taking advantage of the open http proxy on the server. Somebody theoretically could use it to do "bad things" and implicate me in the process.


Http-Replicator isn't an "open" proxy server! You should know that http-replicator was designed with security in mind. It has been thoroughly tested against security scanners, and several security checks are in place! (Nessus still shows 1 false positive if you test it yourself.) The IP access restrictions are part of those security checks. Another restriction is that only access to port 80 is allowed. See the code for more.

rpcyan wrote:

We have DHCP on campus that I have no control over, and I can't guarantee the IP of my client machines for any amount of time. Is it possible to make a whitelist of authorized clients based on DNS instead of IP?


Sounds like you're running a cache for your friends and not the whole campus? The best thing might be to convince the campus admins how much bandwidth they could save by running a cache for everybody! Stats will be in the next release to help convince them!!

Anyway, back to your question. I've considered this already and decided for now that http-replicator isn't the right place to implement this. What you probably want is to dynamically firewall your box. There are scripts on the net that will allow you to set firewall rules based on log entries or various conditions you set. Dynamic firewall rules are the cleanest solution.
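For illustration only, the static version of such firewalling looks like this in iptables (a dynamic setup would add and expire the ACCEPT rules from a script; the port and subnet here are made-up examples, not anything http-replicator ships):

```shell
# Whitelist one subnet for the replicator port, drop everyone else
iptables -A INPUT -p tcp --dport 8080 -s 192.168.1.0/24 -j ACCEPT
iptables -A INPUT -p tcp --dport 8080 -j DROP
```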

rpcyan wrote:

Alternatively, is there any way to limit http-replicator to the gentoo mirrors in /etc/make.conf? I just set my proxy settings appropriately and I was able to pull up google.com, which is obviously not necessary for gentoo caching.


There is ongoing work on this, but there isn't a clean solution yet. Portage will break if you limit the mirrors to those in /etc/make.conf.

In addition, the cache would be severely limited if you tried to limit the upstream mirrors to which it can connect. For example, every ebuild contains its own "homepage" download source, and these can change with every sync!! Many packages are not mirrored at all and must be downloaded from the "homepage" server; others are too new and aren't on the mirrors yet, etc, etc. The end result is that I can't predict the mirror a package will come from, and the user's emerge could fail if there isn't a backup FTP source for the file.

If you need some filtering of requests, Http-Replicator can pass all requests to a filtering proxy such as squid or any one of the hundreds of other filtering proxies. If that isn't enough for you, you probably shouldn't be running a proxy.


In summary:

Http-Replicator is a proxy with access restrictions and security checks. It is designed with a campus LAN in mind where either you have the authority to have abusers prosecuted or have a certain level of trust in your clients. All client requests are logged along with their IP.

If you need more access limits look into dynamic firewall rules. If you want to place more limits on where your users go, http-replicator can forward all requests to a filtering proxy such as squid or other filter/access limiter.


Last edited by flybynite on Tue Aug 31, 2004 6:02 pm; edited 1 time in total
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Tue Aug 31, 2004 5:57 pm    Post subject: Reply with quote

ponion wrote:

I've read through this thread, and it is unclear what the latest version is that I should be using.


Seems clear to me :-) Actually, it probably doesn't matter.

The howto has the most-tested version and complete instructions. For the latest "unstable" version, the instructions only cover the changes from stable, not a complete install. The unstable version adds external proxy support and some prettying of the code, but both work equally well.

So, if you don't need a proxy to get to the net, just use the howto! You can upgrade later when I update the howto. If you have to have the latest, I recommend following the howto, getting it working, and then upgrading to the latest version.

Actually the latest unstable version is very stable :-) I just haven't updated the howto to include it yet!!
rpcyan
n00b


Joined: 29 Aug 2004
Posts: 31
Location: Massachusetts, USA

PostPosted: Tue Aug 31, 2004 6:22 pm    Post subject: Re: dns names instead of IP Reply with quote

flybynite wrote:

Http-Replicator isn't an "open" proxy server! You should know that http-replicator was designed with security in mind. It has been thoroughly tested against security scanners and several security checks are in place!! ( Nessus still shows 1 false positive if you test it yourself) The IP access restrictions are part of those security checks. Another restriction is that only access to port 80 is allowed. See the code for more.


Well, the fact that I can set firefox to the http-replicator proxy and pull up any website I want worries me. I don't see a breach of the box occurring, but I do see the possibility of it being used for something it wasn't intended for.

flybynite wrote:

Sounds like your running a cache for your friends and not the whole campus? The best thing might be to convince the campus admins how much bandwidth they could save by running a cache for everybody!


Well, again, I figured the best way to lock it down so that it couldn't be used for "bad things" is a whitelist. Having the proxy be public knowledge doesn't change that fact, but it does mean that I wouldn't be able to manage a whitelist. As far as the campus admins go, they're pretty Red Hat and not the friendliest folk around. Plus we have a huge pipe, and I doubt they'd care much.

Your comment about running it through squid with filters will probably be the way to go for my situation.
drakos7
Apprentice


Joined: 21 Feb 2003
Posts: 294
Location: Rockville, MD, USA, Earth, Sol

PostPosted: Wed Sep 01, 2004 2:50 pm    Post subject: Reply with quote

Installed 2.1rc3 and it works great! Thanks flybynite! :)

Does putting a script on the server in cron.daily to rm /usr/portage/distfiles/* make sense to keep things clean?
flybynite
l33t


Joined: 06 Dec 2002
Posts: 620

PostPosted: Thu Sep 02, 2004 6:46 am    Post subject: Reply with quote

drakos7 wrote:
Installed 2.1rc3 and it works great! Thanks flybynite! :)

:?: Does putting a script on the server in cron.daily to rm /usr/portage/distfiles/* make sense to keep things clean?



No, repcacheman does that and more. Put repcacheman in cron.daily!!
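One way to wire that up, assuming repcacheman is installed at /usr/bin/repcacheman (cron.daily simply executes any executable placed in it):

```shell
# Make repcacheman run nightly via cron.daily
ln -s /usr/bin/repcacheman /etc/cron.daily/repcacheman
```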
robfantini
Tux's lil' helper


Joined: 10 Jan 2004
Posts: 106
Location: Boston, Massachusetts

PostPosted: Fri Sep 03, 2004 1:32 am    Post subject: Reply with quote

Hello,
I've set up H-R on one computer and it is running.

I get this error on the client:

Connecting to 192.168.1.3:8080... connected.
Proxy request sent, awaiting response... 400 Bad Request
21:19:14 ERROR 400: Bad Request.

----------
here is part of /etc/make.conf on the computer with H-R running:

http_proxy="http://192.168.1.3:8080"
RESUMECOMMAND=" /usr/bin/wget -t 5 --passive-ftp \${URI} -O \${DISTDIR}/\${FILE}"

client: same lines....

As I write this, I suppose the problem is that I don't have httpd running on the server?
robfantini
Tux's lil' helper


Joined: 10 Jan 2004
Posts: 106
Location: Boston, Massachusetts

PostPosted: Sat Sep 04, 2004 6:40 pm    Post subject: Reply with quote

I think that I've followed the install instructions exactly... But I get the following error when I try to emerge a program from the server:


Do you want me to merge these packages? [Yes/No] yes
>>> emerge (1 of 2) sys-apps/module-init-tools-3.0-r2 to /
>>> Downloading http://gentoo.oregonstate.edu/distfiles/modutils-2.4.26.tar.bz2
--14:25:07-- http://gentoo.oregonstate.edu/distfiles/modutils-2.4.26.tar.bz2
=> `/usr/portage/distfiles/modutils-2.4.26.tar.bz2'
Connecting to 192.168.1.3:8080... connected.
Proxy request sent, awaiting response... '
14:25:07 ERROR 400: Bad Request.
.................................................................................
this is from /var/log/http-replicator:

HttpClient 10 Received header from 192.168.1.3:47408

GET http://gentoo.oregonstate.edu/distfiles/modutils-2.4.26.tar.bz2 HTTP/1.0
User-Agent: Wget/1.9
Host: gentoo.oregonstate.edu
Accept: */*

HttpClient 10 Connecting to gentoo.oregonstate.edu
HttpServer 10 Received header from 140.211.166.134:80

HTTP/1.0 400 Bad Request
Content-Type: text/html

HttpServer 10 Closed
HttpClient 10 Closed
..........................................................................
from /etc/make.conf:

PORTDIR_OVERLAY=/bkup/portage/server/local
http_proxy="http://192.168.1.3:8080"

RESUMECOMMAND=" /usr/bin/wget -t 5 --passive-ftp \${URI} -O \${DISTDIR}/\${FILE}
"
.........................................................................


The server is behind a firewall. I made it so port 80 is forwarded to our server. Apache is running, and I can access the Apache test web page from off-site.

Does anyone have a suggestion on how the 400 Bad Request error can be solved?

Also, do I need to have Apache running for H-R to work?

Thanks!
Rob
Page 5 of 24

 