Bug 358923 - [Suggestion] Make an option to output URI list for an emerge -f
Summary: [Suggestion] Make an option to output URI list for an emerge -f
Status: IN_PROGRESS
Alias: None
Product: Portage Development
Classification: Unclassified
Component: Core - Interface (emerge)
Hardware: All
OS: Linux
Importance: Normal enhancement
Assignee: Portage team
URL:
Whiteboard:
Keywords:
Duplicates: 399109 587488
Depends on:
Blocks: 377365 586152
Reported: 2011-03-14 18:35 UTC by David Carlos Manuelda
Modified: 2018-02-02 01:12 UTC
CC List: 5 users

See Also:
Package list:
Runtime testing required: ---


Description David Carlos Manuelda 2011-03-14 18:35:14 UTC
There are times (especially when the machine is not connected to the internet) when it would be useful to get all URIs (output to STDOUT) so the packages can be downloaded by hand.
Currently I need to write down each URI by hand, but an option for that would be useful IMHO.
Example:
emerge -f foo1 foo2 --URIlist
http://www.foo1.com/distfiles/foo1-xxx.tar.bz2
http://www.foo1.com/distfiles/foo1-xxx-do-something.patch
http://www.foo2.com/distfiles/foo2-xxx.tar.bz2

I know it would not be useful for everybody, but I wanted to share this idea anyway.
This way I could download those packages somewhere else and just place them in /usr/portage/distfiles.

What about it?

Reproducible: Always
Comment 1 David Carlos Manuelda 2011-03-14 18:35:59 UTC
This option could even imply -f automatically (just as --skipfirst implies --resume, for example).
Comment 2 Zac Medico gentoo-dev 2011-03-14 18:41:05 UTC
I think emerge --pretend --fetchonly does this already, but maybe we need to document it.
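[Editor's note: a rough, unofficial sketch of how that existing output could be post-processed today into a one-URI-per-file list. The grep/awk patterns and the foo1/foo2 package names are assumptions, and depending on the Portage version the URI list may be printed to stderr, hence the redirection.]

# Pull the URIs out of the pretend/fetchonly output and keep only the
# first mirror URI seen for each distfile name (awk de-duplicates on the
# URL's last path component).
emerge --pretend --fetchonly foo1 foo2 2>&1 \
  | grep -Eo '(https?|ftp)://[^ ]+' \
  | awk -F/ '!seen[$NF]++'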
Comment 3 David Carlos Manuelda 2011-03-14 22:10:09 UTC
But I only need to fetch one URI for each package, not all URIs for all packages.
Comment 4 Zac Medico gentoo-dev 2011-03-14 22:23:43 UTC
Another problem with the existing output is that it doesn't account for SRC_URI arrows that have been supported since EAPI 2. So, you might have trouble getting your files to have the correct names.
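[Editor's note: to illustrate the arrow issue, here is a hypothetical ebuild fragment (example.com and the package are made up). A plain URI list would save the file under the upstream name instead of the renamed distfile name on the right side of the arrow.]

# Hypothetical SRC_URI using the EAPI 2 arrow: the URL's last component
# ("download.php?id=42") is not the name Portage expects in DISTDIR; the
# "-> ${P}.tar.gz" part supplies the real distfile name.
EAPI=2
SRC_URI="https://example.com/download.php?id=42 -> ${P}.tar.gz"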
Comment 5 David Carlos Manuelda 2011-03-14 23:00:42 UTC
Another proposal:
It could also be interesting to dump the list of Portage atoms that would be emerged when another switch is active, which would be useful for downloading that software for another Gentoo machine.
In this case only names with versions would be stored, with only the dependencies needed for the target system, not for the system you are downloading on.
Combined with another option to import this file and completely ignore the dependency check, that would be a solution.
Some time ago I saw something like that in the old paludis client, which in case of failure printed a line containing every package that failed along with its deps, skipping the dependency check.

Example:
emerge foo --PROPOSED_EXPORT_OPTION
=dep1-xxxx =dep2-xxxx =dep3-xxxx =foo-xxxx

emerge -f --PROPOSED_IMPORT_OPTION =dep1-xxxx =dep2-xxxx =dep3-xxxx =foo-xxxx

Now imagine that on the target system dep1 also depends on bar1, but on the system where you want to download the packages bar1 is NOT installed.
Then it would need to skip the dependency check in order not to pull in and download bar1, which is not really needed by the target system.

It could also disable keyword checking, arch checking and other platform-specific things, and only process what is passed as an argument.

To be safe, I suggest having this option force (imply) the -f option.
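[Editor's note: the export/import options above are only proposals and do not exist in emerge. A rough approximation using flags that do exist is sketched below; the parsing of the -pq output and the packages.list file name are assumptions, and the networked machine still has to accept the listed packages' keywords, which is exactly the limitation raised here.]

# On the target (offline) machine: export the resolved atom list.
# -p only computes the list, -q condenses the output to one line per package.
emerge -pq foo \
  | grep '^\[ebuild' \
  | sed -e 's/^\[[^]]*\] //' -e 's/ .*//' -e 's/^/=/' > packages.list

# On the networked machine: fetch exactly those versions.
# -f (--fetchonly) only downloads the distfiles, -O (--nodeps) skips this
# machine's own dependency resolution.
emerge -fO $(cat packages.list)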
Comment 6 David Carlos Manuelda 2011-03-15 07:50:12 UTC
(In reply to comment #4)
> Another problem with the existing output is that it doesn't account for SRC_URI
> arrows that have been supported since EAPI 2. So, you might have trouble
> getting your files to have the correct names.

Maybe a good solution for that would be to have (as an option or the default behavior) a wget-friendly output: one line per file.
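[Editor's note: such a one-URI-per-line list could be fed straight to wget; the urilist.txt file and the target directory below are placeholders.]

# -i reads one URL per line from a file, -P sets the download directory.
wget -i urilist.txt -P /media/pendrive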
Comment 7 David Carlos Manuelda 2011-03-30 11:04:35 UTC
Its behavior could also be to check that each file is downloadable before outputting it to the list.

Example:

emerge -f foo --URIlist --outputpath:/media/pendrive
wget http://foo.com/foo.tar.bz2 -O /media/pendrive/foo-1.2.3.tar.bz2
wget http://foo.com/foo-patches.tar.bz2 -O /media/pendrive/foo-patches-1.2.3.tar.bz2

That way it could handle the -> (SRC_URI arrow) operator too, and by possibly combining it with --emptytree you could have everything downloaded.

As I said before, it could be useful in cases where the system on which you execute this command has different keywords (or is even a different distro) than the one you will download the software for.
Comment 8 Zac Medico gentoo-dev 2012-01-16 19:45:57 UTC
*** Bug 399109 has been marked as a duplicate of this bug. ***
Comment 9 Zac Medico gentoo-dev 2016-06-29 08:26:34 UTC
*** Bug 587488 has been marked as a duplicate of this bug. ***
Comment 10 Martin Mokrejš 2016-06-29 08:33:16 UTC
Also, on the networked host it could try to fstat() the files to be downloaded in /usr/portage/distfiles and, if they are locally available, copy them to the pendrive; if not, do the real network download.

I could of course use basename to mangle the URLs, look the files up in a shell loop, and run wget only if they are not locally available.
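[Editor's note: a minimal sketch of that shell loop, assuming a one-URI-per-line list in uris.txt and the conventional DISTDIR and pendrive paths; all of these are placeholders.]

# For each URI: copy the distfile from the local DISTDIR if it is already
# there, otherwise do the real network download.
DISTDIR=/usr/portage/distfiles
DEST=/media/pendrive
while read -r uri; do
  file=$(basename "$uri")
  if [ -e "$DISTDIR/$file" ]; then
    cp "$DISTDIR/$file" "$DEST/"
  else
    wget "$uri" -O "$DEST/$file"
  fi
done < uris.txt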
Comment 11 Kete Tefid 2016-07-03 07:25:07 UTC
emerge -pf package
already lists the URIs. There are two main problems:
1) It does not have an option for listing one URI per package and for omitting the extra output (like "These are the packages..."), so that the list can easily be imported into download managers (this bug/suggestion).
2) As explained in this bug:
https://bugs.gentoo.org/show_bug.cgi?id=587400
it would be super cool if emerge -pf checked DISTDIR for already downloaded files. Currently it does not check, and it lists all of the needed files for download regardless of whether they have already been downloaded or not.
It is clear that the second point has higher priority than the first one.
This has not been implemented in Portage yet, and because of that I and a lot of other users put unnecessary load on the download servers by re-downloading files we already have.
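[Editor's note: until such an option exists, both points can be approximated by filtering the emerge -pf output: keep one URI per distfile and drop anything already present in DISTDIR. The grep/awk parsing and the "package" argument below are assumptions, not an official interface.]

# Ask Portage for the configured DISTDIR, then print only the URIs whose
# distfile is not already present locally, one per line, ready for a
# download manager.
DISTDIR=$(portageq envvar DISTDIR)
emerge -pf package 2>&1 \
  | grep -Eo '(https?|ftp)://[^ ]+' \
  | awk -F/ '!seen[$NF]++' \
  | while read -r uri; do
      [ -e "$DISTDIR/$(basename "$uri")" ] || echo "$uri"
    done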