For users with no or a slow internet connection I have written a set of scripts for downloading the necessary packages on another computer. They save the addresses to download in the file 'Files' on a portable medium. The script files are 'emerge-zip', 'fetch-files' and 'move-files'.

Usage example:

slow_host > emerge -a gnome    # just to see what has to be downloaded
slow_host > emerge-zip gnome   # save the list of necessary files
fast_host > fetch-files        # on the computer with a fast internet connection; fetches the files with 'wget'
slow_host > move-files         # back on the first computer; moves the files from the medium to the hard disk
slow_host > install            # installs 'gnome'

The file 'install' will be created by the script.

vvv emerge-zip vvv
#!/bin/bash

# medium to mount
MEDIA=/media/usbstick
# directory to save the file list to
DIR=${MEDIA}/portage
# directory with the distfiles
DISTFILES_DIR=/usr/portage/distfiles
# directory with the script files
SCRIPTS_DIR=~/bin

# mount the medium
mount ${MEDIA} 2>/dev/null
mkdir -p ${DIR}/distfiles

# check for write access
touch ${DIR}/distfiles/Files \
    || ( echo "cannot touch '${DIR}/distfiles/Files'"; \
         echo "make sure that '${DIR}/distfiles' is a directory"; \
         echo "exiting"; \
         false ) || exit

# echo the prompt
echo "emerge-zip $@"

# save the packages and the command
echo "$@" >> ${DIR}/packages
echo "echo emerge -a $@ && emerge -a $@" >> ${DIR}/install

# Checks whether the file already exists in the local distfiles directory.
# If not, its internet addresses are printed so they end up in 'Files'.
# params: internet addresses of the file (all mirrors of the same distfile)
function check_for_files() {
    if [ -f ${DISTFILES_DIR}/$(basename $1) ]; then return; fi
    # print the parameters, one per line
    for f in $@; do echo $f; done
} # function check_for_files()

# Create the list of files to be downloaded:
# split the emerge output into lines and, for each line (= one file),
# check whether the file is already on the system; otherwise save its
# addresses in 'Files'.
emerge -fp "$@" 2>&1 >/dev/null \
    | grep -v "^$" \
    | while read line; do
          check_for_files $line
      done \
    >> ${DIR}/Files

# copy the fetch and move scripts
cp ${SCRIPTS_DIR}/emerge-zip  ${DIR}/
cp ${SCRIPTS_DIR}/fetch-files ${DIR}/
cp ${SCRIPTS_DIR}/move-files  ${DIR}/

# print some statistics
echo -n "'Files' has $(wc -l < ${DIR}/Files) lines, "
echo -n "$(sort -u ${DIR}/Files | wc -l) locations "
echo "and $(for f in `sort -u ${DIR}/Files`; do basename $f; done | sort -u | wc -l) files."
^^^ emerge-zip ^^^

vvv fetch-files vvv
#!/bin/sh

# fetch the latest portage snapshot (use a mirror near your location)
wget -nc -nd ftp://ftp.tu-clausthal.de/pub/linux/gentoo/snapshots/portage-latest.tar.bz2

# fetch the files from the file list
wget -nc -nd -i Files -P distfiles
^^^ fetch-files ^^^

vvv move-files vvv
#!/bin/sh

# directory with the portage tree
PORTAGE_DIR=/usr/portage

mv -v portage-latest* ${PORTAGE_DIR}/
mv -v distfiles/*     ${PORTAGE_DIR}/distfiles/
chown -R portage:portage ${PORTAGE_DIR}/portage-latest* ${PORTAGE_DIR}/distfiles/
^^^ move-files ^^^
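To make the data flow concrete, here is a hypothetical look at the medium after running 'emerge-zip foo' (the package name, version and mirror URLs below are invented for illustration; the script relies on 'emerge -fp' printing all mirror URLs of one distfile on a single line):

slow_host > emerge-zip foo
slow_host > cat /media/usbstick/portage/packages
foo
slow_host > cat /media/usbstick/portage/install
echo emerge -a foo && emerge -a foo
slow_host > head -n 2 /media/usbstick/portage/Files
http://distfiles.gentoo.org/distfiles/foo-1.0.tar.bz2
ftp://mirror.example.org/gentoo/distfiles/foo-1.0.tar.bz2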
Can you please attach the scripts to the bug instead of having them inline?
Created attachment 81798: main script
Created attachment 81799: fetches the files
Created attachment 81800: moves the fetched files to the correct location
What's wrong with just saving the `emerge -fp` output to a file and feeding that file to wget? Anyway, this would be a gentoolkit thing.
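A minimal sketch of that variant, assuming the same layout as the scripts above ('Files' and 'distfiles/' on the portable medium are just placeholder names) and that 'emerge -fp' prints all mirror URLs of one distfile on a single line:

vvv fetch-all (hypothetical) vvv
#!/bin/sh
# on the slow host: save the complete fetch list, one URL per line
emerge -fp "$@" 2>&1 >/dev/null | tr ' ' '\n' | grep -v "^$" > Files
# on the fast host: feed the list to wget;
# -nc skips file names that were already downloaded, so duplicate mirrors are harmless
wget -nc -nd -i Files -P distfiles
^^^ fetch-all (hypothetical) ^^^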
`emerge -fp` prints all files. My script filters out the ones that are already in the local distfiles directory (see the 'check_for_files' function).
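The filtering step, restated in isolation as a sketch (paths as in the script above; the input is the 'emerge -fp' output, with all mirrors of one distfile on a single line):

vvv filter sketch (hypothetical) vvv
#!/bin/sh
DISTFILES_DIR=/usr/portage/distfiles
# keep only the lines whose distfile is not yet present locally,
# then print each remaining mirror URL on its own line for wget -i
emerge -fp "$@" 2>&1 >/dev/null \
    | grep -v "^$" \
    | while read first rest; do
          [ -f "${DISTFILES_DIR}/$(basename "$first")" ] && continue
          for url in $first $rest; do echo "$url"; done
      done
^^^ filter sketch (hypothetical) ^^^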