Gentoo Forums
HOWTO:Download Cache for your LAN-Http-Replicator (ver 3.0)
Gentoo Forums Forum Index » Documentation, Tips & Tricks
pHeel
n00b

Joined: 24 Feb 2004
Posts: 11

Posted: Wed May 04, 2005 5:36 am

OK, I just wanted to ask whether it's my box or whether there is an issue with emerging this.

This is what the emerge attempt yields:

Quote:

emerge http-replicator
Calculating dependencies ...done!
>>> emerge (1 of 1) net-misc/http-replicator-3.0 to /
>>> Downloading http://gentoo.chem.wisc.edu/gentoo/distfiles/http-replicator_3.0.tar.gz
--01:21:09-- http://gentoo.chem.wisc.edu/gentoo/distfiles/http-replicator_3.0.tar.gz
=> `/usr/portage/distfiles/http-replicator_3.0.tar.gz'
Resolving gentoo.chem.wisc.edu... 128.104.70.13
Connecting to gentoo.chem.wisc.edu[128.104.70.13]:80... connected.
HTTP request sent, awaiting response... 404 Not Found
01:21:10 ERROR 404: Not Found.


>>> Downloading http://gertjan.freezope.org/replicator/http-replicator_3.0.tar.gz
--01:13:07-- http://gertjan.freezope.org/replicator/http-replicator_3.0.tar.gz
=> `/usr/portage/distfiles/http-replicator_3.0.tar.gz'
Resolving gertjan.freezope.org... 194.109.37.8
Connecting to gertjan.freezope.org[194.109.37.8]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 19,489 [application/x-tar]

100%[==================================================================================>] 19,489 44.32K/s

01:13:08 (44.23 KB/s) - `/usr/portage/distfiles/http-replicator_3.0.tar.gz' saved [19489/19489]

>>> md5 files ; -) http-replicator-3.0.ebuild
>>> md5 files ; -) http-replicator-2.0-r3.ebuild
>>> md5 files ; -) http-replicator-2.1_rc3-r1.ebuild
>>> md5 files ; -) http-replicator-2.1.ebuild
>>> md5 files ; -) http-replicator-2.0-r2.ebuild
>>> md5 files ; -) ChangeLog
>>> md5 files ; -) http-replicator-2.1_rc3.ebuild
>>> md5 files ; -) files/http-replicator-3.0.conf
>>> md5 files ; -) files/http-replicator-3.0.init
>>> md5 files ; -) files/http-replicator-3.0-callrepcacheman-0.1
>>> md5 files ; -) files/http-replicator-2.0-repcacheman-0.11
>>> md5 files ; -) files/http-replicator-2.0-conf-gentoo-patch
>>> md5 files ; -) files/http-replicator-3.0-repcacheman-0.21
>>> md5 files ; -) files/http-replicator-3.0-repcacheman-0.32
>>> md5 files ; -) files/http-replicator-3.0-repcacheman-0.33
>>> md5 files ; -) files/http-replicator-2.0-init
>>> md5 files ; -) files/http-replicator-2.1.conf
>>> md5 files ; -) files/http-replicator-2.1.init
>>> md5 files ; -) files/http-replicator-2.1-repcacheman-0.2
>>> md5 files ; -) files/http-replicator-2.1-repcacheman-0.32
>>> md5 files ; -) files/digest-http-replicator-2.1_rc3-r1
>>> md5 files ; -) files/digest-http-replicator-2.1
>>> md5 files ; -) files/digest-http-replicator-3.0
>>> md5 files ; -) files/digest-http-replicator-2.0-r2
>>> md5 files ; -) files/digest-http-replicator-2.0-r3
>>> md5 files ; -) files/http-replicator-2.0-repcacheman
>>> md5 files ; -) files/http-replicator-3.0-repcacheman-0.2
>>> md5 files ; -) files/http-replicator-2.0-init-gentoo-patch
>>> md5 files ; -) files/digest-http-replicator-2.1_rc3

!!! Digest verification Failed:
!!! /usr/portage/distfiles/http-replicator_3.0.tar.gz
!!! Reason: Filesize does not match recorded size


Multiple attempts, same result. Does anyone have any ideas?

assaf
Apprentice

Joined: 14 Feb 2005
Posts: 152
Location: http://localhost

Posted: Wed May 04, 2005 5:55 am

If you're sure you have the latest ebuild, you can do the following as root:

Code:

ebuild [full_path_to_ebuild_file] digest
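
For this package that would be, assuming the standard tree location:

Code:

ebuild /usr/portage/net-misc/http-replicator/http-replicator-3.0.ebuild digest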

pHeel
n00b

Joined: 24 Feb 2004
Posts: 11

Posted: Wed May 04, 2005 6:02 am

Thanks, that seems to have worked; I'll post either way once I know. I'm still learning all the odds and ends of this distro, and that one was a totally new one to me.

Thanks again :D

zsoltika
l33t

Joined: 13 Nov 2003
Posts: 634
Location: Budapest, Hungary

Posted: Wed May 04, 2005 8:41 am

bigmacx wrote:
For Q2, I understand that when http-replicator gets a request from the client, it transfers the source file from the HR cache directory to the client's local distfiles directory. And if that file is not in the cache, it downloads the file first.
Now, for the http-replicator server itself, if the HR server is not set up to use its own proxy (the PROXY= line in make.conf), then the HR server will download the source files directly from the Internet to the HR server's distfiles directory.
So my thought was to configure the HR server to use the HR proxy. That way, emerges on the HR server would populate the HR cache the same way the clients do, and I could just delete the HR server's distfiles/* the same way I would on the clients.

As far as I understand it, these are the two solutions you have (on the server):

  • 1) Set up another script
    (this way you still have to run repcacheman)
    Code:
    #!/bin/bash
    /usr/bin/emerge "$@" && /usr/bin/repcacheman

  • 2) Set $DISTDIR in make.conf
    This should work, but it would be good if your clients don't download while the server downloads, so set $DISTDIR (in /etc/make.conf) to the same dir as HR's cache (see the sketch below).
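
A make.conf sketch for option 2 (assuming http-replicator's cache dir is /var/cache/http-replicator, as used elsewhere in this thread; adjust to whatever cache dir is set in /etc/conf.d/http-replicator):

Code:
# /etc/make.conf on the server: let Portage download straight into
# http-replicator's cache dir instead of a separate DISTDIR
DISTDIR="/var/cache/http-replicator"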

HTH,
Zsoltika

killercow
Tux's lil' helper

Joined: 29 Jan 2004
Posts: 86
Location: Netherlands

Posted: Wed May 04, 2005 12:50 pm

Did anyone read my post? Or did I ask too much black magic and voodoo from you guys?

I'd really love to have the features I described in my previous post, and since I don't think Portage will be changed to include them anytime soon, I'd better see what other projects might support my ideas.

The install-fest idea also mentioned in this thread would be addressed by this as well.

bigmacx
n00b

Joined: 01 Nov 2004
Posts: 9

Posted: Wed May 04, 2005 5:39 pm

zsoltika wrote:

  • 1) Set up another script
    (this way you still have to run repcacheman)
    Code:
    #!/bin/bash
    /usr/bin/emerge "$@" && /usr/bin/repcacheman

  • 2) Set $DISTDIR in make.conf
    This should work, but it would be good if your clients don't download while the server downloads, so set $DISTDIR (in /etc/make.conf) to the same dir as HR's cache.

HTH,
Zsoltika
I understand #1. #2 causes sharing problems similar to the other, more basic, distfiles sharing via NFS/Samba.

I think maybe I'm confused or simply not being clear enough. Thanks for your continued help, zsoltika!

Does anyone know if it's safe to just set up the http-replicator server to use the http-replicator proxy itself, by configuring make.conf on the http-replicator server in the same manner as a client?
_________________
Gentoo forever baby!

zsoltika
l33t

Joined: 13 Nov 2003
Posts: 634
Location: Budapest, Hungary

Posted: Wed May 04, 2005 7:02 pm

bigmacx wrote:
zsoltika wrote:

  • 2) Set $DISTDIR in make.conf
    This should work, but it would be good if your clients don't download while the server downloads, so set $DISTDIR (in /etc/make.conf) to the same dir as HR's cache.

I understand #1. #2 causes sharing problems similar to the other, more basic, distfiles sharing via NFS/Samba.
I think maybe I'm confused or simply not being clear enough. Thanks for your continued help, zsoltika!
Does anyone know if it's safe to just set up the http-replicator server to use the http-replicator proxy itself, by configuring make.conf on the http-replicator server in the same manner as a client?

OK, I'm reading your blue question now, silly me for misunderstanding.
As the great flybynite explained in a previous post, it could work (of course), so here is a big thanks again for flybynite's great work.
At our workplace three of us use http-replicator as explained there; every one of us is a server for the two others and for our own machine. Your case is much simpler as I see it: set up the clients to use the server, then set http_proxy to localhost:<port> on the server as well.
Then the server will use its own proxy. I don't know exactly how it handles packages missing from the cache (it will simply download them to ../distfiles/ and/or hold a copy in the cache), but it should be handled in the http-replicator script IMHO; correct me if I'm wrong.
We (at my workplace) use the same method: from cron we sync the tree, then 'emerge -uDfN world', then run repcacheman (this is one script), and it just works (in the nights...).
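
A minimal sketch of such a nightly script (paths and emerge options as mentioned above; treat this as an example, not our exact script):

Code:
#!/bin/bash
# Nightly job: sync the tree, pre-fetch all world updates
# (-f = fetch only), then move the fetched files into the
# http-replicator cache with repcacheman.
/usr/bin/emerge sync && \
/usr/bin/emerge -uDfN world && \
/usr/bin/repcacheman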

thecooptoo
Veteran

Joined: 27 Apr 2003
Posts: 1353
Location: UK

Posted: Wed May 04, 2005 8:52 pm    Post subject: connection refused??

Code:
Connecting to 192.168.0.10:12000... failed: Connection refused.
!!! Couldn't download itcl3.2.1_src.tgz. Aborting.



Code:
bash-2.05b# cat /etc/make.conf |grep 12000
http_proxy=http://192.168.0.10:12000
bash-2.05b#               


I've opened a port on the firewall:
Code:

router root # /etc/init.d/http-replicator status
 * status:  started
router root # cat /etc/shorewall/rules |grep 12000
ACCEPT loc              fw              tcp     12000


Where is it being refused?

EDIT
This might get confusing, so I've started this as a new thread in Portage & Programming.
_________________
join the optout - http://nhsconfidentiality.org


Last edited by thecooptoo on Sat May 07, 2005 8:23 pm; edited 1 time in total

assaf
Apprentice

Joined: 14 Feb 2005
Posts: 152
Location: http://localhost

Posted: Thu May 05, 2005 6:58 am

What port and IP did you set in the conf.d file?
Does it work when the firewall is down?

Master One
l33t

Joined: 25 Aug 2003
Posts: 754
Location: Austria

Posted: Thu May 05, 2005 9:30 am

Http-replicator is working great, but I see one unsolved issue:

If I successfully emerge a package on the server, followed by repcacheman, the DISTDIR is empty again of course, because all files downloaded by the http-replicator server get moved to the cache-dir, as intended.

If I now for some reason have to re-emerge any of the already installed packages on the server, it will download the file again, because it only looks in DISTDIR to see if the file is already there, and not in the cache-dir.

Is there any solution for this problem?

I think this is the major downside of having a DISTDIR and a separate cache-dir.

Is there still no solution for having DISTDIR=cache-dir? This would make it so much easier, and would also make it easy to integrate deltup.
_________________
Las torturas mentales de la CIA

zsoltika
l33t

Joined: 13 Nov 2003
Posts: 634
Location: Budapest, Hungary

Posted: Thu May 05, 2005 9:39 am

Master One wrote:
If I now for some reason have to re-emerge any of the already installed packages on the server, it will download the file again, because it only looks in DISTDIR to see if the file is already there, and not in the cache-dir.
Is there any solution for this problem?

Check flybynite's post about it ...

Master One
l33t

Joined: 25 Aug 2003
Posts: 754
Location: Austria

Posted: Thu May 05, 2005 10:13 am

I could not find any related info following the link you provided, zsoltika.

But I already found the solution for the mentioned issue: localhost has to be set as http_proxy in make.conf on the server machine; then it will download the desired file from http-replicator's cache-dir to DISTDIR. :wink:
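
In make.conf terms that is simply (a sketch; the port has to match the one http-replicator listens on per /etc/conf.d/http-replicator, 8080 here is only a placeholder):

Code:
# /etc/make.conf on the http-replicator server itself
http_proxy="http://127.0.0.1:8080"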
_________________
Las torturas mentales de la CIA

zsoltika
l33t

Joined: 13 Nov 2003
Posts: 634
Location: Budapest, Hungary

Posted: Thu May 05, 2005 10:38 am

Master One wrote:
I could not find any related info following the link you provided, zsoltika.

In that post flybynite points to:
Quote:

In each user's /etc/make.conf set http_proxy="their own http-replicator"... Then it will work as you requested!! Downloads will come from the cache first, your friends' cache next, then your box will download the file from the internet. All files you've downloaded will be available to your friends.
What's the trick, you ask? Http-replicator can serve cache content as a normal http server. The alias option allows requests for different dirs to be served from different dirs on disk.

Cheers,
Zsoltika

assaf
Apprentice

Joined: 14 Feb 2005
Posts: 152
Location: http://localhost

Posted: Thu May 05, 2005 10:42 am

I'm also using http-replicator in the symmetric configuration suggested on page 12 (two machines on a LAN that try to download from each other before going to the net). It works fine, and I think it should be mentioned in the first post, as it is a very common situation. Not everyone has a gateway server that is always on.

Enhancement request:
I want to be able to configure the proxy to only look in the other server during specific time ranges of the day (i.e. I don't want to download stuff from my office computer to my home computer after working hours, because that may be considered abuse; or perhaps on weekends the computer is off and I'm needlessly waiting for the connection to time out before it goes on to the next mirror).

Master One
l33t

Joined: 25 Aug 2003
Posts: 754
Location: Austria

Posted: Thu May 05, 2005 11:33 am

zsoltika wrote:
Master One wrote:
I could not find any related info following the link you provided, zsoltika.

In that post flybynite points to:
Quote:

In each user's /etc/make.conf set http_proxy="their own http-replicator"... Then it will work as you requested!! Downloads will come from the cache first, your friends' cache next, then your box will download the file from the internet. All files you've downloaded will be available to your friends.
What's the trick, you ask? Http-replicator can serve cache content as a normal http server. The alias option allows requests for different dirs to be served from different dirs on disk.

Oops, sorry, zsoltika, you were right of course; I seem to be a little distracted today...
_________________
Las torturas mentales de la CIA

assaf
Apprentice

Joined: 14 Feb 2005
Posts: 152
Location: http://localhost

Posted: Sun May 08, 2005 9:39 am

assaf wrote:

Enhancement request:
I want to be able to configure the proxy to only look in the other server during specific time ranges of the day (i.e. I don't want to download stuff from my office computer to my home computer after working hours, because that may be considered abuse; or perhaps on weekends the computer is off and I'm needlessly waiting for the connection to time out before it goes on to the next mirror).


I've added --timeout=15 to FETCHCOMMAND to prevent waiting on a connection to the peer when it's not there, but it would still be nice if there were a way to configure this in the proxy.
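
For anyone wanting to do the same, a sketch of the relevant make.conf lines (the wget options besides --timeout are just a typical example; keep whatever fetch command you already have):

Code:
# /etc/make.conf
FETCHCOMMAND="/usr/bin/wget -t 3 --timeout=15 --passive-ftp \${URI} -P \${DISTDIR}"
RESUMECOMMAND="/usr/bin/wget -c -t 3 --timeout=15 --passive-ftp \${URI} -P \${DISTDIR}"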

flybynite
l33t

Joined: 06 Dec 2002
Posts: 620

Posted: Mon May 09, 2005 2:42 am

Sorry it took so long to get back to you; I travel frequently and don't always have good net access. I'll try to answer the questions still remaining.....


bigmacx wrote:

Does anyone know if it's safe to just set up the http-replicator server to use the http-replicator proxy itself, by configuring make.conf on the http-replicator server in the same manner as a client?


Yes, read the howto again....

Quote:

2. Modify /etc/make.conf on both the server and your other gentoo boxes.


Portage on the http-replicator server doesn't even know http-replicator is on the same box. Portage on the client and server work exactly the same and are set up exactly the same!

flybynite
l33t

Joined: 06 Dec 2002
Posts: 620

Posted: Mon May 09, 2005 2:57 am    Post subject: Re: connection refused??

thecooptoo wrote:
Code:
Connecting to 192.168.0.10:12000... failed: Connection refused.
!!! Couldn't download itcl3.2.1_src.tgz. Aborting.


Where is it being refused?


What does /var/log/http-replicator.log say? Are you sure http-replicator is running?
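
If the log shows nothing, a quick check on the server that something is actually listening on the proxy port (12000 in this case):

Code:
# should show a LISTEN entry for the http-replicator process
netstat -tlnp | grep 12000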

flybynite
l33t

Joined: 06 Dec 2002
Posts: 620

Posted: Mon May 09, 2005 3:43 am

Master One wrote:

I think this is the major downside of having a DISTDIR and a separate cache-dir.

Is there still no solution for having DISTDIR=cache-dir? This would make it so much easier, and would also make it easy to integrate deltup.


This comes up from time to time...
I don't see having a separate cache dir as a problem, but as a feature :-)

1. Http-replicator can run on "other" distros and doesn't need Gentoo. This is a reality because in many situations at work, school etc. you don't own the LAN and may face resistance in converting everything to Gentoo :-) Http-replicator actually started on, and runs just fine on, Debian, but I don't like to mention that.......(Sorry Gertjan..)

2. Portage thinks it owns the DISTDIR and I can't change Portage. Since Portage isn't designed to share, it is a really bad neighbor and leaves corrupt, incomplete and just plain garbage files in the DISTDIR. Imagine http-replicator downloading and saving file.tar.gz while Portage is downloading and saving file.tar.gz in the same dir!

3. This situation only exists on a Gentoo box running http-replicator. My support script repcacheman handles this exact situation and makes it a non-event. After repcacheman runs, there are no dups, no wasted space and no problems :-)

I don't know what the problem is with deltup you mentioned.

Http-replicator has an "alias" feature that serves any dir as a standard http server. This is how it serves binary packages as a PORTAGE_BINHOST and can serve anything else you want from any dir you want.
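
As an illustration of the binhost side only, a client's make.conf could point at the alias-served packages dir like this (the host, port and path are hypothetical and depend entirely on your alias configuration):

Code:
# /etc/make.conf on a client; hypothetical URL, adjust to your alias setup
PORTAGE_BINHOST="http://192.168.0.10:8080/packages/All"

emerge --getbinpkg would then look there for prebuilt packages before building from source.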


Last edited by flybynite on Mon May 09, 2005 4:08 am; edited 1 time in total

flybynite
l33t

Joined: 06 Dec 2002
Posts: 620

Posted: Mon May 09, 2005 4:00 am

assaf wrote:
I'm also using http-replicator in the symmetric configuration suggested on page 12 (two machines on a LAN that try to download from each other before going to the net). It works fine, and I think it should be mentioned in the first post, as it is a very common situation. Not everyone has a gateway server that is always on.


I didn't realize that might be a common situation. I might add a note to the howto..

assaf wrote:
Enhancement request:
I want to be able to configure the proxy to only look in the other server during specific time ranges of the day (i.e. I don't want to download stuff from my office computer to my home computer after working hours, because that may be considered abuse; or perhaps on weekends the computer is off and I'm needlessly waiting for the connection to time out before it goes on to the next mirror).


I'm not sure how many others would need that feature. Actually, I can't imagine why you would ever want to use your work computer as a cache when you're at home. I'm assuming that your internet connection to your work computer isn't any faster than your internet connection to the other mirrors. I guess you would be saving the official mirrors' bandwidth....


Last edited by flybynite on Mon May 09, 2005 4:12 am; edited 1 time in total

Gherald
Veteran

Joined: 23 Aug 2004
Posts: 1399
Location: CLUAConsole

Posted: Mon May 09, 2005 4:06 am

assaf wrote:
Enhancement request:
I want to be able to configure the proxy to only look in the other server during specific time ranges of the day (i.e. I don't want to download stuff from my office computer to my home computer after working hours, because that may be considered abuse; or perhaps on weekends the computer is off and I'm needlessly waiting for the connection to time out before it goes on to the next mirror).

Simply use a bash script or alias that exports alternate environment variables, and use it to wrap emerge.
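
A rough sketch of such a wrapper, assuming you only want to use the remote box as a proxy during weekday working hours (the proxy address and the hours are placeholders):

Code:
#!/bin/bash
# emerge wrapper: use the remote http-replicator only during
# weekday working hours, otherwise fall back to the normal mirrors.
hour=$(date +%H)
day=$(date +%u)   # 1 = Monday ... 7 = Sunday
if [ "$day" -le 5 ] && [ "$hour" -ge 9 ] && [ "$hour" -lt 17 ]; then
    export http_proxy="http://office.example.org:8080"
fi
exec /usr/bin/emerge "$@"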

assaf
Apprentice

Joined: 14 Feb 2005
Posts: 152
Location: http://localhost

Posted: Mon May 09, 2005 6:20 am

Gherald wrote:
assaf wrote:
Enhancement request:
I want to be able to configure the proxy to only look in the other server during specific time ranges of the day (i.e. I don't want to download stuff from my office computer to my home computer after working hours, because that may be considered abuse; or perhaps on weekends the computer is off and I'm needlessly waiting for the connection to time out before it goes on to the next mirror).

Simply use a bash script or alias that exports alternate environment variables, and use it to wrap emerge.


That's a pretty good idea. Thanks.

assaf
Apprentice

Joined: 14 Feb 2005
Posts: 152
Location: http://localhost

Posted: Mon May 09, 2005 6:22 am

Wouldn't it make sense for http-replicator to stop downloading a file if there are no clients requesting it? I.e. if I start emerging something and abort, the file will continue to download (what should I do then if I really don't want the file - restart http-replicator?). It can always resume downloading when someone requests the file again.

Master One
l33t

Joined: 25 Aug 2003
Posts: 754
Location: Austria

Posted: Mon May 09, 2005 10:28 am

flybynite wrote:
Master One wrote:

I think this is the major downside of having a DISTDIR and a separate cache-dir.

Is there still no solution for having DISTDIR=cache-dir? This would make it so much easier, and would also make it easy to integrate deltup.


This comes up from time to time...
I don't see having a separate cache dir as a problem, but as a feature :-)

Ok, convinced :wink:

flybynite wrote:
I don't know what the problem is with deltup you mentioned.

It doesn't work. I played around for two days trying to combine http-replicator & deltup, but it's a no-go. I had it up and running, so that when trying to emerge something on the machine running http-replicator, it downloaded just the delta files and built the resulting archive; but this does not work when trying to emerge anything from a client machine that is not already in the http-replicator cache, because then http-replicator starts downloading the requested file(s) itself with its own download routine. I think this problem can only be solved if http-replicator used Portage's FETCHCOMMAND as defined in make.conf (because that is the only way the getdelta.sh script could be involved) instead of its own download routine.

Any chance for an implementation?

The combination of http-replicator & deltup would be the ultimate download solution; it was amazing to see the saved download volume when trying deltup.
_________________
Las torturas mentales de la CIA

bigmacx
n00b

Joined: 01 Nov 2004
Posts: 9

Posted: Mon May 09, 2005 8:21 pm

flybynite wrote:
Yes, read the howto again....

Quote:

2. Modify /etc/make.conf on both the server and your other gentoo boxes.
Yes, clearly it does say that. Thanks for pointing it out.

I think I've got my situation handled now. I'll explain my present configuration in order to help others implement a fully contained setup, and to invite constructive comments on how to improve this arrangement.

BACKGROUND: I wanted a method to auto-update my Gentoo PCs. That is, auto emerge-sync and auto emerge-update. I also wanted to get emailed status messages back on results and to use a shared distfiles location. I used built-in Gentoo commands, custom scripts, fcron, postfix, http-replicator, and rsync. The first four items are beyond the scope of this thread, so I'll just show those entries and describe the http-replicator and rsync parts.

On the Server:
1. Install fcron, postfix, http-replicator.

2. Configure http-replicator server as per howto.

3. Configure fcron normally.

4. Configure postfix as appropriate (ssmtp will work).

5. Add this kind of cron entry for root.

/etc/crontab
Code:
&bootrun(true),exesev(false),first(5),mail(true),mailto(xxxxx@comcast.net),serial(true) 0 3 * * *      /etc/scripts/emergesync
&bootrun(true),exesev(false),first(15),mail(true),mailto(xxxxx@comcast.net),serial(true) 0 5 * * *      /etc/scripts/emergeupdate

These lines are almost all about fcron. Basically, I wanted the emerge scripts to run every day, and to run if the machine was down overnight. The important part here is that the server was configured to "emerge sync" at 3AM and "emerge update" at 5AM.

6. Add the 2 scripts to /etc/scripts:

/etc/scripts/emergesync
Code:
#!/bin/sh
emerge sync --nospinner

/etc/scripts/emergeupdate
Code:
#!/bin/sh
emerge -p --nospinner --update world &&
emerge --nospinner --update world &&

repcacheman &&

rm -vrf /usr/portage/distfiles/*
rm -vrf /var/tmp/portage/*

This script does the actual updating and cleans up portage aftermath.

7. Configure rsyncd on the server:

/etc/rsync/rsyncd.conf
Code:
[cache]
path = /var/cache/http-replicator
comment = Gentoo Linux dstfiles mirror
read only = false


This configures the rsyncd server to allow clients to upload to it. There is much more to configuring the rsyncd server, and it is documented in this thread:
https://forums.gentoo.org/viewtopic.php?t=59134&highlight=rsync

8. Start all server services and verify proper operation.


On each Client:
1. Install fcron, postfix.

2. Copy these files from the http-replicator server to the client (preserve directory structure):
Code:
/etc/conf.d/http-replicator
/usr/bin/repcacheman
/usr/bin/repcacheman.py
You could just as well install the full http-replicator package on the client but I just needed these 3 files.

3. Configure fcron normally.

4. Configure postfix as appropriate (ssmtp will work).

5. Add this kind of cron entry for root.

/etc/crontab
Code:
&bootrun(true),exesev(false),first(5),mail(true),mailto(xxxxx@comcast.net),serial(true) 0 4 * * *      /etc/scripts/emergesync
&bootrun(true),exesev(false),first(15),mail(true),mailto(xxxxx@comcast.net),serial(true) 0 5 * * *      /etc/scripts/emergeupdate

Same file, basically. The important part here is that the client was configured to "emerge sync" at 4AM and "emerge update" at 5AM.

6. Add the 2 scripts to /etc/scripts:

/etc/scripts/emergesync
Code:
#!/bin/sh
emerge sync --nospinner

/etc/scripts/emergeupdate
Code:
#!/bin/sh
emerge -p --nospinner --update world &&
emerge --nospinner --update world &&

repcacheman &&

rm -vrf /usr/portage/distfiles/*
rm -vrf /var/tmp/portage/*

rsync /var/cache/http-replicator/* rsync://xxxxx/cache
rm -vrf /var/cache/http-replicator/*

This script does the actual updating and cleans up portage aftermath. On the clients, repcacheman moves valid source files into /var/cache/http-replicator and then rsync copies them up to the central source location.

7. Start all client services and verify proper operation.


COMMENTS:
This is working for me now. I haven't uber engineered all the aspects, so there may well be more effective ways to do parts of the above. If you see any improvements that can be made, please post them!

The way I read the original HOWTO, there was an emphasis on repeated running of repcacheman. This is mentioned as important to do on the server, but not mentioned concerning the clients.

Now I understand that the continued need to run repcacheman on the server is due to server emerges. After the server emerges, it is possible that not all files used during the emerge will be in the cache, but will be left in the distfiles directory. So running repcacheman after a server-based emerge moves these "finds" into the http-replicator cache.

What function will these newly moved files have in the future? Probably little. The original server emerge, using the server as its own proxy, did not populate the cache directory with these files, but did populate the distfiles directory. How would any future user of the cache benefit from these new files? It would seem that the client would use whatever out-of-band method to get these files, just like the server did.

So I assume in this case that the continued use of repcacheman is just to centralize all the sources. This is my GOAL here. And having assumed that, I wondered: if we need to run repcacheman on the server, what about the clients? Well, you need to handle that case also to achieve the goal.

Others in this thread have mentioned running multiple copies of http-replicator on several PCs. I wanted a simple client-server setup and just installed the parts of http-replicator that I needed to make the clients work.

I used rsync to upload back to the server because I wanted to stay away from shared NFS/Samba source directories, and I hope that rsyncd will handle the multiple-client update problem inherent in just sharing distfiles over NFS/Samba.

Thanks for your work, flybynite, and all the others in this thread. Any constructive feedback would be greatly appreciated!
_________________
Gentoo forever baby!

All times are GMT
Page 13 of 24