Bug 946268 - app-misc/ollama-9999: build system changed
Summary: app-misc/ollama-9999: build system changed
Status: UNCONFIRMED
Alias: None
Product: GURU
Classification: Unclassified
Component: Package issues
Hardware: All Linux
Importance: Normal major
Assignee: Paul Zander
URL:
Whiteboard:
Keywords: EBUILD
Depends on:
Blocks: 920301 935842 948424
Reported: 2024-12-11 12:30 UTC by rogerroger
Modified: 2025-02-10 09:42 UTC (History)
12 users

See Also:
Package list:
Runtime testing required: ---


Attachments
potential ebuild (ollama-9999.ebuild,2.53 KB, application/vnd.gentoo.ebuild)
2024-12-11 13:07 UTC, rogerroger
fix "potential ebuild" merge issue (fix_failure_on_empty_lib_dir.diff,414 bytes, patch)
2024-12-17 14:20 UTC, Spekadyon

Description rogerroger 2024-12-11 12:30:37 UTC
Ollama recently merged a PR that overhauls the build system and makes it easier for distributions to provide packages. This means, however, that the current way of calling `ego generate` fails.

Ollama now provides a set of Makefiles that handle most of the build process. However, there are a few things to keep in mind:

* By default, Ollama will unconditionally build the main executable along with additional AVX and AVX2 runners.
* You may need to adjust paths to align with where Gentoo installs libraries.

I've created a new ebuild and tested various AVX and AMDGPU flags without encountering any issues.

Here is a link to the PR, which may have some additional information: https://github.com/ollama/ollama/pull/7499.
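For context, a rough sketch of how the Makefile-based build might map onto ebuild phases, assuming a top-level `dist` make target and the default `dist/linux-amd64` output layout (phase bodies here are illustrative, not the attached ebuild):

```shell
# Hypothetical excerpt of an ollama-9999 ebuild adapted to the new
# Makefile-based build; target and path names are assumptions.
src_compile() {
	# Upstream's Makefiles now drive the whole build; 'dist' produces
	# the main binary plus any enabled GPU runners.
	emake dist
}

src_install() {
	# Paths under dist/ may need adjusting to Gentoo's libdir layout.
	dobin dist/linux-amd64/bin/ollama
	insinto /usr/lib/ollama
	doins -r dist/linux-amd64/lib/ollama/.
}
```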

Reproducible: Always
Comment 1 rogerroger 2024-12-11 13:07:32 UTC
Created attachment 913808 [details]
potential ebuild

Here are the differences in the updated ebuild:

- The minimum Go version has been updated to 1.22.
- The `nvidia` and `amd` USE flags have been renamed to `cuda` and `amdgpu` for 
consistency with other packages.
- `sci-libs/hipBLAS` has been added as a dependency for AMD builds.
- AMDGPU support has been fixed.

Issues with my ebuild:
- CUDA is untested as I don't have an nvidia GPU.
- The `sed` call is currently used as a workaround for a build 
issue. During compilation, the project defaults to `-Werror` and 
`-Wunused-command-line-argument`. At some point in the build process, an 
unsupported `--hip-version` flag is automatically added, leading to a compile 
failure.

Ideally, there should be a cleaner solution than the `sed` workaround, but I don't know of one.

Running this on amd hardware without the patch in https://bugs.gentoo.org/935558 will crash. I have no issues running Ollama with the patch applied.
Comment 2 Viorel Munteanu gentoo-dev 2024-12-14 13:07:27 UTC
You can request access to GURU and push it yourself, please check https://wiki.gentoo.org/wiki/Project:GURU
Comment 3 Spekadyon 2024-12-17 14:18:56 UTC
The ebuild produces the following errors when compiled with USE="-cuda" or USE="cuda", and:
* AMDGPU_TARGETS="gfx90a gfx906 gfx908 gfx942 gfx1030 gfx1100 -gfx803 -gfx900 -gfx940 -gfx941 -gfx1010 -gfx1011 -gfx1012 -gfx1031 -gfx1101 -gfx1102"
* CPU_FLAGS_X86="avx avx2 -avx512"
* VIDEO_CARDS="-amdgpu"
There is no ollama directory in dist/linux-amd64/lib/ in either build.

>>>> Install app-misc/ollama-9999 into /var/tmp/portage/app-misc/ollama-9999/image
>Traceback (most recent call last):
>  File "/usr/lib/portage/python3.12/doins.py", line 609, in <module>
>    sys.exit(main(sys.argv[1:]))
>             ^^^^^^^^^^^^^^^^^^
>  File "/usr/lib/portage/python3.12/doins.py", line 598, in main
>    if _doins(
>       ^^^^^^^
>  File "/usr/lib/portage/python3.12/doins.py", line 441, in _doins
>    return install_runner.install_file(source, os.path.dirname(dest))
>           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>  File "/usr/lib/portage/python3.12/doins.py", line 375, in install_file
>    return self._ins_runner.run(source, dest_dir)
>           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>  File "/usr/lib/portage/python3.12/doins.py", line 183, in run
>    sstat = os.stat(source)
>            ^^^^^^^^^^^^^^^
>FileNotFoundError: [Errno 2] No such file or directory: b'dist/linux-amd64/lib/ollama'

The attached patch solves the issue by dealing with the lib/ollama directory only if it exists. I don't know if it's the proper way to fix it though.
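For illustration, a standalone shell sketch of the guard the patch adds: only touch lib/ollama if the build actually produced it (paths mirror the build output; the real ebuild would use `doins` inside the `if`):

```shell
set -eu

# Simulate a build tree where the runners were NOT compiled,
# so dist/linux-amd64/lib/ollama never gets created.
src=$(mktemp -d)
mkdir -p "$src/dist/linux-amd64/lib"

# Guarded install: skip the directory when it does not exist,
# instead of failing like the unconditional doins did.
if [ -d "$src/dist/linux-amd64/lib/ollama" ]; then
	installed=yes
else
	installed=no
fi

rm -rf "$src"
echo "installed=$installed"
```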
I do have a computer with an NVIDIA card to test the ebuilds, if needed.
Comment 4 Spekadyon 2024-12-17 14:20:02 UTC
Created attachment 914234 [details, diff]
fix "potential ebuild" merge issue
Comment 5 rogerroger 2024-12-17 14:35:49 UTC
Thanks for testing. 

I thought ollama would always create the directory and just not populate it if the runners aren't compiled. You could alternatively check for the presence of amdgpu or cuda flags, but I think your solution is cleaner.

One more issue I discovered yesterday was that files in /lib/ollama need to be installed with execute permission. For rocm there's a libggml_rocm.so and ollama_llama_server executable. What would be the appropriate way to handle that?

I guess there's install -m 755, but it seems no ebuild does it that way.
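A plain-shell sketch of the `install -m 755` route (the file names are illustrative); in an ebuild, the idiomatic equivalents would be `exeinto`/`doexe` for installing executables, or `fperms 0755` on files already installed:

```shell
set -eu

# Demonstrate installing a file with execute permission via
# `install -m 755`; the server binary name is illustrative.
tmp=$(mktemp -d)
printf '#!/bin/sh\n' > "$tmp/ollama_llama_server"

install -m 755 "$tmp/ollama_llama_server" "$tmp/installed_server"

# Verify the installed copy is executable.
if [ -x "$tmp/installed_server" ]; then exec_ok=yes; else exec_ok=no; fi
rm -rf "$tmp"
echo "exec_ok=$exec_ok"
```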
Comment 6 Paul Zander 2025-01-25 17:44:18 UTC
The ebuild misses almost all changes done in ::gentoo for cuda and rocm support, and it is quite naive when configuring the build.
Comment 7 Egor 2025-01-26 09:06:33 UTC
I've updated `ollama-9999.ebuild` in the GURU repo based on the changes proposed in this bug.

I can also confirm that ollama is now using the AMD GPU to run inference.

We should probably also introduce ollama ebuilds for the releases, to avoid breaking world updates on ollama/HEAD changes.
Comment 8 rogerroger 2025-01-26 12:28:37 UTC
(In reply to Paul Zander from comment #6)
> The ebuild misses almost all changes done in ::gentoo for cuda and rocm
> support. As well as being quite naive when configuring the build.

Life has gotten in the way and I haven't had a chance to properly look into it again.



(In reply to Egor from comment #7)
> I've updated `ollama-9999.ebuild` in the GURU repo based on the changes proposed
> in this bug.
> 
> Also can confirm that now ollama is using AMD GPU to run inference.
> 
> We should probably also introduce ollama ebuilds for the releases, to avoid
> breaking world updates on ollama/HEAD changes.

Are the additional ollama_llama_server binaries executable? It will work fine if they aren't and just silently not use them, but they are needed to actually utilize the GPU.


Also, changing the find line to

find "${S}" -name "*.go" -exec sed -i "s/ -Werror / -Werror -Wno-unused-command-line-argument /g" {} + || die

should work as well and keep -Werror.
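As a sanity check, the substitution can be exercised on a throwaway `.go` file (plain shell, GNU `sed` assumed; the cgo line is made up for the demo):

```shell
set -eu

# Create a sample .go file carrying cgo flags that include -Werror.
tmp=$(mktemp -d)
printf '// #cgo CFLAGS: -O2 -Werror -pipe\n' > "$tmp/gen.go"

# The proposed rewrite: keep -Werror, but suppress the warning that
# the auto-injected --hip-version flag would otherwise promote to an error.
find "$tmp" -name "*.go" -exec sed -i \
	"s/ -Werror / -Werror -Wno-unused-command-line-argument /g" {} +

flags=$(cat "$tmp/gen.go")
rm -rf "$tmp"
echo "$flags"
```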
Comment 9 Paul Zander 2025-01-26 15:55:27 UTC
(In reply to Egor from comment #7)
> I've updated `ollama-9999.ebuild` in the GURU repo based on the changes proposed
> in this bug.
> 
> Also can confirm that now ollama is using AMD GPU to run inference.
> 
> We should probably also introduce ollama ebuilds for the releases, to avoid
> breaking world updates on ollama/HEAD changes.

Please don't touch it further. There are things still incorrect in that version and I will fix all of it anyway.
Comment 10 Larry the Git Cow gentoo-dev 2025-01-27 09:57:21 UTC
The bug has been referenced in the following commit(s):

https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=ac646f42de00d0fcada1326f887692fedef16879

commit ac646f42de00d0fcada1326f887692fedef16879
Author:     Yahor Berdnikau <egorr.berd@gmail.com>
AuthorDate: 2025-01-26 09:02:59 +0000
Commit:     Yahor Berdnikau <egorr.berd@gmail.com>
CommitDate: 2025-01-26 09:02:59 +0000

    app-misc/ollama: Fixes build due to the upstream build system changes
    
    Bug: https://bugs.gentoo.org/946268
    Signed-off-by: Yahor Berdnikau <egorr.berd@gmail.com>

 app-misc/ollama/metadata.xml       |  3 +-
 app-misc/ollama/ollama-9999.ebuild | 76 +++++++++++++++++++++++++++++---------
 2 files changed, 60 insertions(+), 19 deletions(-)
Comment 11 Justin 2025-01-31 00:53:00 UTC
When emerging, I receive the following error; I'm not sure if it's related to the new ebuild from the 26th.

Fetching Dependencies...
google.golang.org/protobuf/runtime/protoimpl
google.golang.org/protobuf/types/descriptorpb
google.golang.org/protobuf/types/gofeaturespb
# gopkg.in/yaml.v3 v3.0.1
## explicit
gopkg.in/yaml.v3
# gorgonia.org/vecf32 v0.9.0
## explicit; go 1.13
gorgonia.org/vecf32
# gorgonia.org/vecf64 v0.9.0
## explicit; go 1.13
gorgonia.org/vecf64
>>> Source unpacked in /tmp/portage/app-misc/ollama-9999/work
>>> Preparing source in /tmp/portage/app-misc/ollama-9999/work/ollama-9999 ...
>>> Source prepared.
>>> Configuring source in /tmp/portage/app-misc/ollama-9999/work/ollama-9999 ...
>>> Source configured.
>>> Compiling source in /tmp/portage/app-misc/ollama-9999/work/ollama-9999 ...
make -j8 -l8 dist
make: *** No rule to make target 'dist'.  Stop.
 * ERROR: app-misc/ollama-9999::guru failed (compile phase):
 *   emake failed
 *
 * If you need support, post the output of `emerge --info '=app-misc/ollama-9999::guru'`,
 * the complete build log and the output of `emerge -pqv '=app-misc/ollama-9999::guru'`.
 * The complete build log is located at '/tmp/portage/app-misc/ollama-9999/temp/build.log'.
 * The ebuild environment file is located at '/tmp/portage/app-misc/ollama-9999/temp/environment'.
 * Working directory: '/tmp/portage/app-misc/ollama-9999/work/ollama-9999'
 * S: '/tmp/portage/app-misc/ollama-9999/work/ollama-9999'

>>> Failed to emerge app-misc/ollama-9999, Log file:

>>>  '/tmp/portage/app-misc/ollama-9999/temp/build.log'

 * Messages for package app-misc/ollama-9999:

 * ERROR: app-misc/ollama-9999::guru failed (compile phase):
 *   emake failed
 *
 * If you need support, post the output of `emerge --info '=app-misc/ollama-9999::guru'`,
 * the complete build log and the output of `emerge -pqv '=app-misc/ollama-9999::guru'`.
 * The complete build log is located at '/tmp/portage/app-misc/ollama-9999/temp/build.log'.
 * The ebuild environment file is located at '/tmp/portage/app-misc/ollama-9999/temp/environment'.
 * Working directory: '/tmp/portage/app-misc/ollama-9999/work/ollama-9999'
 * S: '/tmp/portage/app-misc/ollama-9999/work/ollama-9999'
Comment 12 norbiros 2025-01-31 09:05:37 UTC
Yeah, the build system was changed once again in this commit: https://github.com/ollama/ollama/commit/dcfb7a105c455ae8d44a06b3380731d8b1ffcc22
Comment 13 Egor 2025-01-31 09:41:42 UTC
> Are the additional ollama_llama_server binaries executable? It will work fine if they aren't and just silently not use them, but they are needed to actually utilize the GPU.

Yes, there is `/usr/lib/ollama/runners/rocm/ollama_llama_server`, which is not executable, but `ollama ps` shows that inference is running on the GPU, and it is faster than on my CPU.

> Yeah, the build system was changed once again in this commit

That is why we also need to introduce an ebuild for Ollama releases.

Paul Zander, if you don't mind, I could convert the existing 9999 ebuild to use the latest Ollama release, 0.5.7, and you can update it eventually with your desired changes.
Comment 14 Paul Zander 2025-01-31 16:28:37 UTC
(In reply to Egor from comment #13)
> Paul Zander, if you don't mind - I could convert existing 9999 ebuild to use
> latest Ollama release 0.5.7 and you will update it eventually with your
> desired changes.

Please do; you could just add a 0.5.7 release. Otherwise I'll do it in the next 6 hours, after I've finished the cuda-12.8.0 stuff.

I am on IRC as negril, if you need to find me.
Comment 15 Egor 2025-01-31 18:42:48 UTC
Pushed the ebuild for the ollama-0.5.7 release into guru/dev; it is based on the existing 9999 one.
Comment 16 Larry the Git Cow gentoo-dev 2025-02-01 16:10:16 UTC
The bug has been referenced in the following commit(s):

https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=1879435c4a4365e815a275c8c40069c7b5f99b9a

commit 1879435c4a4365e815a275c8c40069c7b5f99b9a
Author:     Yahor Berdnikau <egorr.berd@gmail.com>
AuthorDate: 2025-01-31 19:13:09 +0000
Commit:     Yahor Berdnikau <egorr.berd@gmail.com>
CommitDate: 2025-01-31 19:13:09 +0000

    app-misc/ollama: keyword 0.5.7 for ~amd64
    
    Bug: https://bugs.gentoo.org/946268
    Signed-off-by: Yahor Berdnikau <egorr.berd@gmail.com>

 app-misc/ollama/ollama-0.5.7.ebuild | 2 ++
 1 file changed, 2 insertions(+)

https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=bfaa93dcace907596bee18483e11f9831f5d6cf9

commit bfaa93dcace907596bee18483e11f9831f5d6cf9
Author:     Yahor Berdnikau <egorr.berd@gmail.com>
AuthorDate: 2025-01-31 18:40:07 +0000
Commit:     Yahor Berdnikau <egorr.berd@gmail.com>
CommitDate: 2025-01-31 18:40:59 +0000

    app-misc/ollama: add 0.5.7
    
    Bug: https://bugs.gentoo.org/946268
    Signed-off-by: Yahor Berdnikau <egorr.berd@gmail.com>

 app-misc/ollama/Manifest            |   2 +
 app-misc/ollama/ollama-0.5.7.ebuild | 113 ++++++++++++++++++++++++++++++++++++
 2 files changed, 115 insertions(+)
Comment 17 James McGeehan IV 2025-02-01 22:32:42 UTC
Testing out the 0.5.7 ebuild resulted in being unable to utilize the GPU properly.

`/usr/lib/ollama/runners/rocm/ollama_llama_server` needs to be executable.

See: https://github.com/ollama/ollama/issues/8743
Comment 18 Larry the Git Cow gentoo-dev 2025-02-10 09:42:07 UTC
The bug has been referenced in the following commit(s):

https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=e39750a3bb6ffeb16ff7ef57cb80a5948c6f753f

commit e39750a3bb6ffeb16ff7ef57cb80a5948c6f753f
Author:     Paul Zander <negril.nx+gentoo@gmail.com>
AuthorDate: 2025-02-09 17:40:51 +0000
Commit:     Paul Zander <negril.nx+gentoo@gmail.com>
CommitDate: 2025-02-09 17:40:59 +0000

    app-misc/ollama: mark executables as exacutable
    
    The build system has exectuables in /usr/lib/ollama/runners/rocm/ ...
    
    Bug: https://bugs.gentoo.org/946268#c17
    Signed-off-by: Paul Zander <negril.nx+gentoo@gmail.com>

 app-misc/ollama/ollama-0.5.7-r1.ebuild | 4 ++++
 1 file changed, 4 insertions(+)