| Summary: | [science-overlay] sys-cluster/mpich2-1.0.8-r1 installs py{c,o} files in /usr/bin/ | | |
|---|---|---|---|
| Product: | Gentoo Linux | Reporter: | Martin Mokrejš <mmokrejs> |
| Component: | New packages | Assignee: | Gentoo Linux bug wranglers <bug-wranglers> |
| Status: | RESOLVED INVALID | | |
| Severity: | normal | CC: | hp-cluster |
| Priority: | High | | |
| Version: | unspecified | | |
| Hardware: | All | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Package list: | | Runtime testing required: | --- |
Description
Martin Mokrejš
2009-06-04 20:41:21 UTC
Oh no, maybe that is the old issue with these two packages clashing, see related bug #248307:

# equery belongs /usr/bin/mpiexec
 * Searching for /usr/bin/mpiexec ...
sys-cluster/mpiexec-0.82-r1 (/usr/bin/mpiexec)
sys-cluster/mpich2-1.0.8-r1 (/usr/bin/mpiexec -> mpiexec.py)
#

That's how python compiling works. If, after unmerging mpich2, the .py{c,o} files are still there, please reopen this bug.

nfssrv mpiexec # ls -la /usr/bin/mpiexec*
-rwxr-xr-x 1 root root 47928 Jun  4 22:38 /usr/bin/mpiexec.gforker
-rwxr-xr-x 1 root root 60173 Jun  4 22:38 /usr/bin/mpiexec.py
-rw-r--r-- 1 root root 42201 May 19  2008 /usr/bin/mpiexec.pyc
-rw-r--r-- 1 root root 42201 May 19  2008 /usr/bin/mpiexec.pyo
nfssrv mpiexec # ls -la /usr/bin/mpirun*
lrwxrwxrwx 1 root root     7 Jun  4 22:38 /usr/bin/mpirun -> mpiexec
lrwxrwxrwx 1 root root    10 Jun  4 22:38 /usr/bin/mpirun.py -> mpiexec.py
-rw-r--r-- 1 root root 42200 May 19  2008 /usr/bin/mpirun.pyc
-rw-r--r-- 1 root root 42200 May 19  2008 /usr/bin/mpirun.pyo
nfssrv mpiexec #

The above symlink from mpirun to a non-existent file appeared after I unmerged sys-cluster/mpiexec and installed sys-cluster/osc-mpiexec. I reinstalled mpich2 several times today, as the dates and times show. Anyway, I did what you asked for:

nfssrv mpiexec # emerge --unmerge mpich2

 sys-cluster/mpich2
    selected: 1.0.8-r1
   protected: none
     omitted: none

>>> 'Selected' packages are slated for removal.
>>> 'Protected' and 'omitted' packages will not be removed.

>>> Waiting 5 seconds before starting...
>>> (Control-C to abort)...
>>> Unmerging in: 5 4 3 2 1
>>> Unmerging sys-cluster/mpich2-1.0.8-r1...
 * GNU info directory index is up-to-date.

 * IMPORTANT: 1 news items need reading for repository 'gentoo'.
 * Use eselect news to read news items.
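The "that's how python compiling works" remark above refers to CPython writing byte-compiled `.pyc`/`.pyo` companions next to each installed `.py` file. A minimal sketch of why such companions can go stale (file names here are hypothetical stand-ins, not the actual installed paths; modern CPython defaults to a `__pycache__/` directory, so `cfile` is passed explicitly to mimic the old side-by-side layout):

```python
import os
import py_compile
import tempfile

# Hypothetical stand-in for an installed script such as /usr/bin/mpiexec.py.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "mpiexec.py")
with open(src, "w") as f:
    f.write("print('hello from mpiexec')\n")

# Byte-compile next to the source, the way Python 2.4 produced mpiexec.pyc.
pyc = src + "c"
py_compile.compile(src, cfile=pyc)

# Removing only the .py -- as an uninstall that doesn't track byte-compiled
# files would -- leaves the compiled copy behind as a stale orphan.
os.remove(src)
print(os.path.exists(src))   # the source is gone
print(os.path.exists(pyc))   # the byte-compiled file is stale but still present
```

This is why the `.pyc`/`.pyo` files in the listings below can survive an unmerge: unless the package manager recorded them at install time, nothing removes them alongside the `.py`.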
nfssrv mpiexec # ls -la /usr/bin/mpirun*
-rw-r--r-- 1 root root 42200 May 19  2008 /usr/bin/mpirun.pyc
-rw-r--r-- 1 root root 42200 May 19  2008 /usr/bin/mpirun.pyo
nfssrv mpiexec # ls -la /usr/bin/mpiexec*
-rw-r--r-- 1 root root 42201 May 19  2008 /usr/bin/mpiexec.pyc
-rw-r--r-- 1 root root 42201 May 19  2008 /usr/bin/mpiexec.pyo
nfssrv mpiexec #

I suspect these are stale files left behind by the sys-cluster/mpiexec ebuild instead.

(In reply to comment #3)
> nfssrv mpiexec # ls -la /usr/bin/mpirun*
> -rw-r--r-- 1 root root 42200 May 19  2008 /usr/bin/mpirun.pyc
> -rw-r--r-- 1 root root 42200 May 19  2008 /usr/bin/mpirun.pyo
> nfssrv mpiexec # ls -la /usr/bin/mpiexec*
> -rw-r--r-- 1 root root 42201 May 19  2008 /usr/bin/mpiexec.pyc
> -rw-r--r-- 1 root root 42201 May 19  2008 /usr/bin/mpiexec.pyo
> nfssrv mpiexec #
>
> I suspect these are stale files left behind by the sys-cluster/mpiexec ebuild instead.

No, you were right, they were from mpich2:

nfssrv mpiexec # strings /usr/bin/mpirun.pyo
usage: mpiexec [-h or -help or --help]   # get this message
mpiexec -file filename                   # (or -f) filename contains XML job description
mpiexec [global args] [local args] executable [args]
   where global args may be
      -l                                 # line labels by MPI rank
      -bnr                               # MPICH1 compatibility mode
      -machinefile                       # file mapping procs to machines
      -s <spec>                          # direct stdin to "all" or 1,2 or 2-4,6
      -1                                 # override default of trying 1st proc locally
      -ifhn                              # network interface to use locally
      -tv                                # run procs under totalview (must be installed)
      -tvsu                              # totalview startup only
      -gdb                               # run procs under gdb
      -m                                 # merge output lines (default with gdb)
      -a                                 # means assign this alias to the job
      -ecfn                              # output_xml_exit_codes_filename
      -g<local arg name>                 # global version of local arg (below)
   and local args may be
      -n <n> or -np <n>                  # number of processes to start
      -wdir <dirname>                    # working directory to start in
      -umask <umask>                     # umask for remote process
      -path <dirname>                    # place to look for executables
      -host <hostname>                   # host to start on
      -soft <spec>                       # modifier of -n value
      -arch <arch>                       # arch type to start on (not implemented)
      -envall                            # pass all env vars in current environment
      -envnone                           # pass no env vars
      -envlist <list of env var names>   # pass current values of these vars
      -env <name> <value>                # pass this value of this env var
mpiexec [global args] [local args] executable args : [local args] executable...
mpiexec -gdba jobid                      # gdb-attach to existing jobid
mpiexec -configfile filename             # filename contains cmd line segs as lines
(See User Guide for more details)
Examples:
   mpiexec -l -n 10 cpi 100
   mpiexec -genv QPL_LICENSE 4705 -n 3 a.out
   mpiexec -n 1 -host foo master : -n 4 -host mysmp slave
ctimes
Ralph Butler and Rusty Lusks
$Revision: 1.90 $t
[cut]
nfssrv mpiexec # file /usr/bin/mpirun.pyc
/usr/bin/mpirun.pyc: python 2.4 byte-compiled
nfssrv mpiexec # file /usr/bin/mpirun.pyo
/usr/bin/mpirun.pyo: python 2.4 byte-compiled
nfssrv mpiexec #

I suspect it could be that I once compiled and installed mpich2 myself. If you see no explanation of how that could have happened, close as INVALID; I cannot exclude my own fault.

I'm not seeing it locally. Closing as the above comment suggested.
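The `file` command above identifies the orphans as "python 2.4 byte-compiled" from the version-specific magic number in the first bytes of the `.pyc`. A minimal sketch of the same check from Python itself (file names are hypothetical; this compares against the running interpreter's magic number rather than a version table):

```python
import importlib.util
import os
import py_compile
import tempfile

# Hypothetical stand-in for a byte-compiled file like /usr/bin/mpirun.pyc.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "mpirun.py")
with open(src, "w") as f:
    f.write("pass\n")
pyc = src + "c"
py_compile.compile(src, cfile=pyc)

# A .pyc begins with a magic number tied to the interpreter version; that
# header is what lets `file` report e.g. "python 2.4 byte-compiled".
with open(pyc, "rb") as f:
    header = f.read(4)

print(header == importlib.util.MAGIC_NUMBER)  # matches the interpreter that wrote it
```

A mismatch between that header and the interpreter in use is exactly how a two-year-old stale `.pyc` (note the May 2008 timestamps against the June 2009 install) betrays its origin.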