Gentoo Forums
HOWTO: Central Gentoo Mirror for your Internal Network

rickj
Guru


Joined: 06 Feb 2003
Posts: 359
Location: Calgary, Alberta, Canada

PostPosted: Mon Jan 05, 2004 7:07 pm    Post subject: Reply with quote

The original post is a great method, and works well for me. Saves time and net bandwidth.

Just a minor buglet:

Quote:
FETCHCOMMAND="rsync rsync://<your Portage gateway's IP or DNS>/gentoo-packages/\${FILE} {DISTDIR}"


seems to be missing a $, mine works as:

Code:
FETCHCOMMAND="rsync rsync://<your Portage gateway's IP or DNS>/gentoo-packages/\${FILE} ${DISTDIR}"
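For anyone wondering why that one character matters: with the `$`, the shell expands the variable; without it, `{DISTDIR}` is passed to rsync as a literal string. A throwaway sketch (host and filename made up):

```shell
FILE=zip23.tar.gz
DISTDIR=/usr/portage/distfiles

# with the $ the variable expands to the real path:
echo "rsync://192.168.0.1/gentoo-packages/${FILE} ${DISTDIR}"
# -> rsync://192.168.0.1/gentoo-packages/zip23.tar.gz /usr/portage/distfiles

# without it, rsync is told to write into a directory literally
# named {DISTDIR} relative to wherever emerge happens to be running:
echo "rsync://192.168.0.1/gentoo-packages/${FILE} {DISTDIR}"
# -> rsync://192.168.0.1/gentoo-packages/zip23.tar.gz {DISTDIR}
```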


Thanks for a truly useful HOWTO.
megalomax
n00b


Joined: 06 Feb 2003
Posts: 52
Location: germany

PostPosted: Fri Jan 16, 2004 3:13 pm    Post subject: what to do when the package is not on the server? Reply with quote

Hi there!

I really LOVE your setup, but all the other suggestions are really confusing me 8O

I'm still trying to figure out what to do if a package is not on the server and thus needs to be downloaded by the server rather than by the client...

Is there a way to do it with the original method? Do I need some bash IF...THEN routines (I only know very basic bash stuff, sorry)?
Or maybe the client could send the package back to the server after downloading it (by scp or something)?

Or would I have to use a different approach?

thanks for your input...

man, I love these forums
:wink:
megalomax
n00b


Joined: 06 Feb 2003
Posts: 52
Location: germany

PostPosted: Fri Jan 16, 2004 4:10 pm    Post subject: @savage and all the other helpful wizards around here... Reply with quote

Hi again ...

I don't wanna spam here, but I just saw the nice work of savage...
I followed his instructions, but I'm not sure what I'm getting on my machine.

I did some testing, and this is what happened...

machineServer: no distfile of a certain package (<fam> in my case) present
machineClient: old version of <fam> is present, needs updating...

1) client: emerge -U fam

2) client tries to get the package from the server... not present...

3) client downloads the package and installs it

4) package still not present on the server...


Is there something wrong with my make.conf?
Should I see an error when the PHP solution fails for some reason?
I just recently upgraded to apache2, but I don't know if this has something to do with all this...
Is the location of the apache htdocs dir different in this case?

:roll:

cheers
Ateo
Advocate


Joined: 02 Jun 2003
Posts: 2021
Location: Republic of California

PostPosted: Sun Jan 25, 2004 3:58 am    Post subject: Reply with quote

This works great. I've just cut my stage 1 install times on workstations by 1/3. Thanks for the howto!!!

Gentoo Rocks!
savage
Apprentice


Joined: 01 Jan 2003
Posts: 161

PostPosted: Mon Jan 26, 2004 5:06 pm    Post subject: an update is coming! Reply with quote

megalomax - just saw your message;

am looking into what has to be done in a current gentoo setup - will update the original post and let you know when it works.
savage
Apprentice


Joined: 01 Jan 2003
Posts: 161

PostPosted: Mon Jan 26, 2004 7:21 pm    Post subject: A way to get files that actually works in a network environment Reply with quote

ok folks,
holler if I am missing something or making no sense. I seem to do that sometimes :-)

This is the way I posted earlier, and (now that I am back in Gentoo) it doesn't seem to work. Here is what is working for me now. Please give me feedback if you want / need changes.

Code:

<?php
//put in /var/www/localhost/htdocs/getFile.php on server
$packageSrc = trim($_GET["packageSrc"]);
$packageName = strrchr($packageSrc,"/");
$packageName = trim($packageName, "/");

if($packageName != "")
{
  @$fileHandle = fopen("/usr/portage/distfiles/" . $packageName, "r");
  if(!$fileHandle)
  {
    exec(escapeshellcmd("/usr/local/sbin/getPackageFromMirror $packageSrc"));
    @$fileHandle = fopen("/usr/portage/distfiles/" . $packageName, "r");
    if(!$fileHandle)
    {
      print "Unable to get File from remote Server\n";
      exit;
    }
    else
    {
      fpassthru($fileHandle);
      exit;
    }
  }
  else
  {
    fpassthru($fileHandle);
    exit;
  }
}
?>


and the C program:
Code:

/*
put in /usr/local/src
cd to /usr/local/src
gcc -s -o getPackageFromMirror getPackageFromMirror.c
cp -v getPackageFromMirror /usr/local/sbin/
chown -v root.root /usr/local/sbin/getPackageFromMirror
chmod -v 4775 /usr/local/sbin/getPackageFromMirror
*/
#include <stdio.h>
#include <unistd.h>
#include <string.h>
#include <strings.h>
#include <errno.h>

int main(int argc, char* argv[])
{
  extern int errno;
  int i=0;
  char wgetCommand[1024];
  char fileTarget[1024];
  char* tok;

  if ((argc < 2) || (argc > 3))
  {
    printf("Usage: %s <packageToRetrieve> [packageName]\n", argv[0]);
    return(-1);
  }

  memset(wgetCommand,0,1024);
  memset(fileTarget,0,1024);

  snprintf(wgetCommand,1024,"%s",argv[1]); /* pass argv[1] as data, not as a format string */
  if(argc == 3)
  {
    snprintf(fileTarget,1024,"/usr/portage/distfiles/%s",argv[2]);
  }
  else
  {
    tok = strrchr(wgetCommand,'/');
    if(tok != NULL)
    {
       snprintf(fileTarget,1024,"/usr/portage/distfiles/%s",tok+1);
    }
  }
  execl("/usr/bin/wget","wget","-q","-N","-O",fileTarget,wgetCommand, NULL);
  /* execl only returns on failure */
  fprintf(stderr,"%s\n",strerror(errno));
  return(-1);
}


put this on your "proxy box"
and update your "FETCHCOMMAND" in "/etc/make.conf" to be:
Code:

FETCHCOMMAND="/usr/bin/wget -t 5 -O \${DISTDIR}/\${FILE} http://[proxyBoxHere]/getFile.php?packageSrc=\${URI}"


Let me know.

Edit: added exit; statements after termination - thanks to Aneurysm9


Last edited by savage on Sun Feb 08, 2004 10:31 pm; edited 2 times in total
Aneurysm9
n00b


Joined: 08 Feb 2004
Posts: 21

PostPosted: Sun Feb 08, 2004 2:07 am    Post subject: Reply with quote

I'm not sure if it's related to my PHP setup or to your script, but it was adding a newline to the end of the files it was sending me, resulting in failed MD5 checks. I eventually figured out that adding "exit;" at the end of the main if block fixed the problem.
not_registered
Tux's lil' helper


Joined: 04 Feb 2003
Posts: 148

PostPosted: Sat Mar 06, 2004 5:02 am    Post subject: Reply with quote

I don't know what I'm talking about, but can't you use SQUID to do this somehow?
_________________
It's Floam, it's Floam. It's flying foam!
savage
Apprentice


Joined: 01 Jan 2003
Posts: 161

PostPosted: Mon Mar 08, 2004 1:06 pm    Post subject: Squid question Reply with quote

Yes!

You can use squid to do this, but all of the files are stored under human-unreadable names in caching directories (something like ac3x5rvfdaiwldk instead of reiserfsprogs-xxxx, etc). Also, when you store both the cache and your distfiles on one computer, you are shelling out twice as much hard disk space as if you do it the way above.

savage
linkfromthepast
n00b


Joined: 18 Mar 2004
Posts: 23

PostPosted: Thu Mar 18, 2004 3:43 pm    Post subject: Another way Reply with quote

Here's a little perl script I wrote to take care of the package download and serving. We use this on our internal network of approx 100 nodes. So far everything seems to work correctly, although I'm sure there is a lot of room for improvement. As far as security is concerned, there is none. I'm sure a lot can be built in, but we filter by address w/ ipfilter so I didn't feel it was necessary. For some this may be complete garbage, others may use it, but if you find anything in it useful then I think it was worth posting.

File: dist.pl
Code:


#!/usr/bin/perl
########################################################
#       GENTOO LOCAL MIRROR
# This script provides local-mirror functionality for gentoo.
# Simply point the mirror setting on the client machines at the
# webserver that is serving this script.
########################################################

########################################################
#       OUTLINE
# 1.) Get request for a file; if the file is not present, a 404 error takes place and through .htaccess this script is called
# 2.) The script then downloads the file from one of the mirrors listed below, saves it in a web-accessible directory, and sends
#       a Location header back to the wget that emerge is running, redirecting it to the file
# 3.) If the file DOES exist, the script simply redirects the client
#
#
########################################################

########################################################
#       INSTALL
# 1.) Put the .htaccess file in the directory which you want your Gentoo mirror in make.conf to point to
# 2.) Edit .htaccess to point to this script
# 3.) Make sure in apache.conf there is an entry allowing the script to run in that directory
#       EXAMPLE: ScriptAlias /dist /var/www/localhost/htdocs/
########################################################



use CGI;

#used to redirect the client to the new location of the file
$address="http://your.address/dir";
#location of wget used to get files
$wgetlocation = "/usr/bin/wget";
#switches passed to wget
$wgetswitches = "-nc -c -t 5 --passive-ftp ";
#directory to put the new dist files in
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
$distdir = "distfiles/";
#mirrors to use to get the gentoo files from
@mirrors = ("ftp://ibiblio.org/pub/Linux/distributions/gentoo/","ftp://mirror.iawnet.sandia.gov/pub/gentoo/","ftp://gentoo.ccccom.com","http://128.213.5.34/gentoo/");



#this is the ENV var which holds the address that was attempting to be accessed
$url = $ENV{"REQUEST_URI"};
#split the input
@parts = split(/\//, $url);
#count the parts
$count = $#parts;
#get the last part, which is the filename
$filename = $parts[$count];

#do we need distdir?
#$url = $mirrors[0].$distdir.$url;

if(!(-e $wgetputdir."/".$filename))
{
  #create the url for the file to fetch with wget
  $url = $mirrors[0].$url;
  $command = $wgetlocation." ".$wgetswitches." ".$url." -P ".$wgetputdir;
  #run the command and wait for it to finish
  open(FILE, "$command |");
  $output = <FILE>;
  close(FILE);
}

#create a new CGI object for the redirect
$query = new CGI;
#redirect the client to the (now present) file
print $query->redirect($address."/".$filename);




File: .htaccess
Code:


ErrorDocument 404 /dist/dist.pl
La`res
Arch/Herd Tester


Joined: 11 Aug 2003
Posts: 79
Location: ::0

PostPosted: Mon Mar 29, 2004 12:03 am    Post subject: Reply with quote

linkfromthepast - Could you be more detailed in the install instructions? They seem a little vague to me. Mind you, I'm relatively new to apache.
_________________
Lares Moreau <lares.moreau@gmail.com>
LRU: 400755 http://counter.li.org
lares/irc.freenode.net
linkfromthepast
n00b


Joined: 18 Mar 2004
Posts: 23

PostPosted: Wed Apr 07, 2004 4:35 pm    Post subject: Reply with quote

Which part are you having problems with?

1.) Perl script goes in a directory on your server which you have set to execute scripts in the apache.conf.
2.) Change the variables in the Perl script based on your setup.
3.) Put the .htaccess in the folder where all your dist files will live.

So when a client requests a file in that directory and the file does not exist, the .htaccess is used, which calls the script to download the file and then redirects the client to the file. Sorry for the run-on :)

Also, one thing I've noticed is that if you don't have much bandwidth, either Perl or Apache stalls the wget process. I'm leaning towards Apache because it actually owns the wget process, but I'm not quite sure. So as long as you can download your dist file in under ~30 secs you'll be ok. Larger dist files like those for KDE should probably be done manually until the problem is fixed.

If that is still too broad an explanation, please post some specific questions. Good luck :)
modnemo
n00b


Joined: 10 Aug 2003
Posts: 18

PostPosted: Thu Apr 08, 2004 4:04 pm    Post subject: Emerge problem.... Reply with quote

I can rsync files no problem using rsync...

Code:
rsync rsync://192.168.0.1/gentoo-packages/zip23.tar.gz


But when I emerge anything I get this error...

Code:

>>> emerge (1 of 20) net-fs/samba-2.2.8a to /
!!! File system problem. (Bad Symlink?)
!!! Fetching may fail: [Errno 2] No such file or directory: ''


any ideas?
modnemo
n00b


Joined: 10 Aug 2003
Posts: 18

PostPosted: Thu Apr 08, 2004 6:05 pm    Post subject: In make.conf type #USER=ID10T Reply with quote

OK... so never mind my previous post... really dumb error.

Make sure (absolutely sure) that if you are typing in the variables in your make.conf file you use ${FILE} and not $(FILE), because the latter doesn't work.

Let me reiterate: if you are having weird errors while doing an emerge, but emerge sync works just fine, make sure that for the shell variables you use { and not (.

Thanks for the HOWTO, it was awesome... I was looking for a solution to emerge packages on my Fujitsu Stylistic 1200 tablet without having to connect it to the internet (wireless support sucks for ADM8211-based cards). I looked into NFS but it's a lot of setup and requires kernel options which I didn't install. I was able to do the portage/rsync server in about 10 steps (and an hour of frustration trying to figure out what I typed wrong :D ).
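The difference is easy to demo: `${FILE}` is variable expansion, while `$(FILE)` is command substitution, i.e. the shell tries to run a program called FILE, finds none, and substitutes an empty string — which matches the "No such file or directory: ''" error above. A quick sketch:

```shell
FILE=samba-2.2.8a.tar.gz

# ${FILE} expands the variable:
echo "fetching ${FILE}"
# -> fetching samba-2.2.8a.tar.gz

# $(FILE) tries to execute a command named FILE; there isn't one,
# so the substitution is empty and portage ends up with '' as the path:
name="$(FILE 2>/dev/null)" || true
echo "fetching '${name}'"
# -> fetching ''
```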
linkfromthepast
n00b


Joined: 18 Mar 2004
Posts: 23

PostPosted: Thu Apr 08, 2004 7:04 pm    Post subject: Reply with quote

modnemo : Which method/setup did you use?
Merlin-TC
l33t


Joined: 16 May 2003
Posts: 603
Location: Germany

PostPosted: Sat Apr 10, 2004 5:21 am    Post subject: Reply with quote

First of all thanks a lot for that guide.
I am using a local rsync server now to sync the tree, and an nfs share for the distfiles.

What I'd like to know is if there is an option in the rsync server that lets me cache the portage tree somehow.
The machine it's on is a K6-3 450 with 256MB RAM, but the hard disk is kinda slow, so the bottleneck is the drive.
Is there any way to cache at least some parts of the portage tree in RAM, or to pre-read it?
razamatan
Apprentice


Joined: 28 Feb 2003
Posts: 160

PostPosted: Fri Apr 23, 2004 2:17 am    Post subject: Reply with quote

what if i want to synchronize /usr/local/portage (a portage overlay)? i tend to roll my own ebuilds, so i'd like to sync this directory (and *just* this directory) between two machines.
_________________
a razamatan doth speaketh,
"Never attribute to malice, that which can be adequately explained by stupidity"
freshy98
Apprentice


Joined: 11 Jul 2002
Posts: 274
Location: The Netherlands

PostPosted: Sun Apr 25, 2004 8:16 pm    Post subject: Reply with quote

linkfromthepast wrote:
Which part are you having problems with?

1.) Perl script goes in a directory on your server which you have set to execute scripts in the apache.conf.
2.) Change the variables in the Perl script based on your setup.
3.) Put the .htaccess in the folder where all your dist files will live.

So when a client requests a file in that directory, and the file does not exist, the .htaccess is used, which calls the script to download the file, then redirects the client to the file to download. Sorry for the run on :)

Also, one thing I've noticed is if you don't have a bandwidth, either Perl or Apache stalls the wget process. I'm leaning towards Apache because it actually owns the wget process, but not quite sure. So as long as you can download your dist file in under ~30secs you'll be ok. Although larger dist files like those for KDE should probably be done manually until the problem is fixed.

If that is still to broad an explanation, please post some specific questions. Good luck :)


Could you please explain where to put the files? What is /dist? Does it need to be named that way, or is it an example?

In the Perl script you talk about
Code:
#used to redirect client to new localtion of file
$address="http:///your.address/dir";

while a little bit further you talk about
Code:
#directory to put the new dist files in
$wgetputdir = "/var/www/localhost/htdocs/distfiles";

Isn't the /dir from the address line the same as the /distfiles from the wgetputdir line?
It is very confusing.

Please try to explain a bit more thoroughly and use examples from your own system(s).

Thnx
_________________
Mac Pro single quad 2.8GHz, 6GB RAM, 8800GT. MacBook. Plus way too many SUN/Cobalt/SGI and a lonely Alpha.
arkane
l33t


Joined: 30 Apr 2002
Posts: 918
Location: Phoenix, AZ

PostPosted: Mon Apr 26, 2004 5:02 am    Post subject: Reply with quote

razamatan wrote:
what if i want to synchronize /usr/local/portage (a portage overlay)? i tend to roll my own ebuilds, so i'd like to sync this directory (and *just* this directory) between two machines.


rsync --rsh="ssh -C" -plarvz username@servermachine:/usr/local/portage /usr/local/portage

That'd do it... set up the SSH keys between the machines if you want to automate it. When you say *just* this directory, do you mean not the subdirectories of it? Because IMHO that'd be pointless....
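To automate that, a cron entry on the receiving box could look like this (a sketch with a hypothetical user/host; assumes the SSH keys mentioned above are already in place):

```shell
# crontab entry: pull the overlay every night at 03:00.
# Note the trailing slash on the source so the overlay's contents
# land directly in /usr/local/portage instead of a subdirectory.
0 3 * * * rsync --rsh="ssh -C" -plarvz username@servermachine:/usr/local/portage/ /usr/local/portage
```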
razamatan
Apprentice


Joined: 28 Feb 2003
Posts: 160

PostPosted: Mon Apr 26, 2004 5:11 am    Post subject: Reply with quote

arkane wrote:
razamatan wrote:
what if i want to synchronize /usr/local/portage (a portage overlay)? i tend to roll my own ebuilds, so i'd like to sync this directory (and *just* this directory) between two machines.


rsync --rsh="ssh -C" -plarvz username@servermachine:/usr/local/portage /usr/local/portage

That'd do it... set-up the SSH keys between the machines if you want to automate it. When you say *just* this directory, you mean not the subdirectories of it? Because IMHO that'd be pointless....


yes, recursive, cus it'd be pointless otherwise... :wink:

i tried this method, but it doesn't work..

Code:
rsync --rsh="ssh -C" -uavz username@servermachine:/usr/local/portage/ /usr/local/portage


it complains about permissions (writing locally), but i have write perms via group membership..
_________________
a razamatan doth speaketh,
"Never attribute to malice, that which can be adequately explained by stupidity"
linkfromthepast
n00b


Joined: 18 Mar 2004
Posts: 23

PostPosted: Mon Apr 26, 2004 5:34 pm    Post subject: Reply with quote

You are correct, $address is the actual web address of the directory. $wgetputdir is the absolute filesystem path for the directory represented by $address. For example, /var/www/localhost/htdocs/distfiles would be the absolute path, but Apache is configured with /var/www/localhost/htdocs as its root, so http://your.address/dir maps to /var/www/localhost/htdocs/dir. You can change the name to whatever you like.

The overall purpose of this is so that when a client requests a file in http://your.address/dir and the file does not exist, the Perl script downloads the file and tells the client to try again now that the file has been downloaded. This is what the .htaccess file is for.
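The URL-to-filesystem mapping described above can be sketched in shell (paths are examples, matching a stock Gentoo Apache layout):

```shell
docroot=/var/www/localhost/htdocs      # Apache's document root
base=http://your.address               # server part of $address

url=$base/dir/foo-1.0.tar.gz           # what the client requests
path=${url#"$base"}                    # strip the server part -> /dir/foo-1.0.tar.gz
echo "$docroot$path"
# -> /var/www/localhost/htdocs/dir/foo-1.0.tar.gz
```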
linkfromthepast
n00b


Joined: 18 Mar 2004
Posts: 23

PostPosted: Mon Apr 26, 2004 5:48 pm    Post subject: Reply with quote

Also keep in mind that all web addresses are relative to the root of the web server.

#used to redirect client to new location of file
$address="http:///your.address/dir";
#location of wget used to get files
$wgetlocation = "/usr/bin/wget";
#switches passed to wget
$wgetswitches = "-nc -c -t 5 --passive-ftp ";
#directory to put the new dist files in
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
$distdir = "distfiles/";
#mirrors to use to get the gentoo files from
@mirrors = ("ftp://ibiblio.org/pub/Linux/distributions/gentoo/"," ftp://mirror.iawnet.sandia.gov/pub/gentoo/","ftp://gentoo.ccccom.com"," http://128.213.5.34/gentoo/");

$address, $distdir are relative paths
$wgetlocation, $wgetputdir are absolute paths

You might notice that $distdir isn't needed, so you don't need to configure it. I'm not sure why I left it in the script.

$address should be configured with the web address the client will use for trying to download the file. You can test this w/ a web browser.

$wgetputdir is the location wget puts the files it downloads. So if a client requests http://127.0.0.1/gentoo-files/x.y.z.tar.gz then wget will download the file into the $wgetputdir directory.

Example:

$address = http://127.0.0.1/distfiles
$wgetputdir = /var/www/localhost/htdocs/distfiles

One last note: you need to change the @mirrors servers to the servers that are fastest for you. These may not be the quickest; they're just default servers I plucked from /etc/make.conf.
freshy98
Apprentice


Joined: 11 Jul 2002
Posts: 274
Location: The Netherlands

PostPosted: Wed Apr 28, 2004 8:54 am    Post subject: Reply with quote

OK, let me see if I get this right.
Instead of /usr/portage/distfiles I now need a /var/www/localhost/htdocs/distfiles which holds the .htaccess.

I think I will make /var/www/localhost/htdocs/distfiles a symlink to /usr/portage/distfiles.
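That symlink idea can be tried out with throwaway paths first (swap in /usr/portage/distfiles and /var/www/localhost/htdocs for real use):

```shell
# demo with temporary paths; the real command would be
#   ln -s /usr/portage/distfiles /var/www/localhost/htdocs/distfiles
mkdir -p /tmp/demo/portage/distfiles /tmp/demo/htdocs
ln -sfn /tmp/demo/portage/distfiles /tmp/demo/htdocs/distfiles

# files dropped into distfiles are now visible through the web root:
touch /tmp/demo/portage/distfiles/foo-1.0.tar.gz
ls /tmp/demo/htdocs/distfiles/
# -> foo-1.0.tar.gz
```

Note that Apache will only serve files through such a symlink if the directory has Options FollowSymLinks enabled.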
_________________
Mac Pro single quad 2.8GHz, 6GB RAM, 8800GT. MacBook. Plus way too many SUN/Cobalt/SGI and a lonely Alpha.
freshy98
Apprentice


Joined: 11 Jul 2002
Posts: 274
Location: The Netherlands

PostPosted: Wed Apr 28, 2004 12:35 pm    Post subject: Reply with quote

linkfromthepast, it does not seem to work for me. I edited the perl script to my liking, but when I do an emerge -f package, for example, it just freezes.

I have this on my portage gateway (192.168.1.20):
Code:
GENTOO_MIRROS="http://192.168.1.20/distfiles/"
SYNC="rsync://192.168.1.20/gentoo-portage"
PORTDIR=/usr/portage
DISTDIR=/usr/portage/distfiles
PKGDIR=/usr/portage/packages

plus I have a symlink so that /var/www/localhost/htdocs/distfiles actually points to /usr/portage/distfiles. Here I also have the .htaccess file.

The perl script is in /dist/dist.pl.
Code:
$address="http:///192.168.1.20/distfiles";
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
@mirrors = ("ftp.easynet.nl/mirror/gentoo/","ftp.snt.utwente.nl/pub/os/linux/gentoo/","etc,etc");



On the client machine I have this in /etc/make.conf:
Code:
GENTOO_MIRRORS="http://192.168.1.20:8080/distfiles/"
SYNC="rsync://192.168.1.20/gentoo-portage"
PORTDIR=/usr/portage
DISTDIR=${PORTDIR}/distfiles


/usr/portage/distfiles is shared via nfs on the portage gateway and mounted on the client in /usr/portage/distfiles.

Can you help me out please?
_________________
Mac Pro single quad 2.8GHz, 6GB RAM, 8800GT. MacBook. Plus way too many SUN/Cobalt/SGI and a lonely Alpha.
linkfromthepast
n00b


Joined: 18 Mar 2004
Posts: 23

PostPosted: Wed Apr 28, 2004 10:06 pm    Post subject: Reply with quote

Have you tried going to the link with a regular web browser to see what the response is (http://192.168.1.20/distfiles)? Is wget installed on the gateway? Are you sure apache has write permission to the /usr/portage/distfiles directory?

BTW, I'm not sure if you noticed, but $address="http:///192.168.1.20/distfiles"; should only have 2 //, it's my mistake. And GENTOO_MIRROS="http://192.168.1.20/distfiles/" is missing an R. :)

Let me know if/when you've tried all that and the responses.
Page 3 of 6

 