Since the update to media-libs/vulkan-layers-1.3.239, Vulkan is broken if validation layers are loaded:

```
Enable Layers:
- VK_LAYER_KHRONOS_validation
Exception: VK_ERROR_LAYER_NOT_PRESENT
Source File: src/shared/vulkan/src/devkInstance.cpp
```

vulkan-layers does install the layer:

```
> vulkaninfo | grep VK_LAYER_KHRONOS_validation
VK_LAYER_KHRONOS_validation (Khronos Validation Layer) Vulkan version 1.3.239, layer version 1:
```

But enabling it fails. Is there some kind of debug environment variable that can be set to figure out where vulkan-layers fails to build/install the layer properly?
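A loader-side trace may show where the layer lookup fails; this is a sketch assuming the standard Vulkan-Loader environment variables (`VK_LOADER_DEBUG`, `VK_INSTANCE_LAYERS`) — substitute the actual application for the commented-out `vulkaninfo` call:

```shell
# Ask the Vulkan loader to log everything it does while discovering and
# enabling layers (VK_LOADER_DEBUG=all is a standard Vulkan-Loader variable).
export VK_LOADER_DEBUG=all
# Force-enable the validation layer from outside the application:
export VK_INSTANCE_LAYERS=VK_LAYER_KHRONOS_validation
# Sanity-check the environment before launching the app:
echo "VK_LOADER_DEBUG=$VK_LOADER_DEBUG"
# vulkaninfo 2>&1 | grep -i VK_LAYER_KHRONOS_validation
```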
Hi,

Thanks for the bug report. Can you post some repro steps that show enabling the layer failing? For example, running some app with particular options? `vkcube --validate` seems to work here, but maybe that's not exercising the right path.

Thanks,
Sarnex
I downloaded the vkcube repository and built it, but the current version no longer seems to recognize "--validate".
(In reply to Plüss Roland from comment #2)
> I downloaded the vkcube repository and build it but the current version
> seems to not know "--validate" anymore.

How did you get the original error?
https://github.com/LordOfDragons/dragengine — but you would need to create a debug build:

```
scons with_debug=yes build_igde=no prefix=/opt/tempbuild
```

since the validation layer is only enabled in debug builds. You could then use, for example, https://github.com/LordOfDragons/deexamples/releases/download/v1.9/DSTestProject.delga to run. Any DELGA is fine since the problem is logged when the engine starts up.

Run `delauncher DSTestProject.delga` and find the logs at ~/.config/delauncher/logs/games/9febef48-f384-4ca2-899e-ae8efdaf996d/logs/last_run.log . Search for "Enable Layers:" and you should be right at the top of the thrown exception.

Note: this requires the extension "VK_EXT_debug_report" and the layer "VK_LAYER_KHRONOS_validation" to be present, otherwise the code is not run.
I keep getting this error when trying to build. As it seems this is your repository, can you take a look? Note that with_system_openal=yes or with_system_openal=auto also doesn't work; same error. Thanks.

```
scons with_debug=yes with_system_openal=no prefix=/home/sarnex/Desktop/here
...
use system openal | no
...
Build OpenAL
scons: *** [extern/openal/build/lib/libopenal.a] ReadError : file could not be opened successfully:
- method gz: ReadError('not a gzip file')
- method bz2: ReadError('not a bzip2 file')
- method xz: ReadError('not an lzma file')
- method tar: ReadError('truncated header')
Traceback (most recent call last):
  File "/usr/lib/python3.10/site-packages/SCons/Action.py", line 1310, in execute
    result = self.execfunction(target=target, source=rsources, env=env)
  File "/home/sarnex/Desktop/dragengine/extern/openal/SConscript", line 67, in openalBuild
    untarArchive(buildDir.abspath, source[0].abspath)
  File "/home/sarnex/Desktop/dragengine/SConsCommon.py", line 49, in untarArchive
    tf = tarfile.open(source, 'r')
  File "/usr/lib/python3.10/tarfile.py", line 1639, in open
    raise ReadError(f"file could not be opened successfully:\n{error_msgs_summary}")
tarfile.ReadError: file could not be opened successfully:
- method gz: ReadError('not a gzip file')
- method bz2: ReadError('not a bzip2 file')
- method xz: ReadError('not an lzma file')
- method tar: ReadError('truncated header')
scons: building terminated because of errors.
```
This is due to the repository bandwidth restriction on GitHub. Run `git lfs pull` before building; otherwise all binary files are placeholders :(
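A Git LFS placeholder is a tiny text file whose first line begins with `version https://git-lfs...`, which is why tarfile rejects it as "not a gzip file". A quick way to check (sketch; a fake pointer file is written to /tmp here for illustration — point the check at the real archive, e.g. the one under extern/openal/, instead):

```shell
# Simulate an LFS pointer file to demonstrate the check.
printf 'version https://git-lfs.github.com/spec/v1\noid sha256:abc\nsize 123\n' > /tmp/sample-archive
# A real tarball starts with binary magic bytes; an LFS pointer starts with "version".
if head -c 7 /tmp/sample-archive | grep -q '^version'; then
  echo "LFS pointer - run 'git lfs pull' first"
else
  echo "real file"
fi
```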
Ah ok, you might want to consider adding that to the doc.
Hm, strange. I checked the GitHub settings and the git config, and they are correct; there should be no problem. I did a fresh clone just now and all the LFS files were downloaded during the clone. No idea why it did not work for you. I've updated the ReadMe anyway in case GitHub causes problems again.
The problem was I didn't have git-lfs installed at all. With it installed and set up, cloning and building work fine without `git lfs pull`. Maybe just add git-lfs to the dependencies. Anyway, I'll try to look at the real issue soon.
Didn't think about that. Updated the ReadMe with the Git LFS dependency.
Finally got back to this. I can repro the issue on 1.3.239, but it works on 1.3.243 (current unstable). Can you try the 1.3.243 stack and see if it works for you as well? If so, we can consider stabilizing 1.3.243 a bit early. Thanks
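For testing on an otherwise stable system, the ~arch versions can be keyworded. A sketch — the package atoms below are assumptions (only media-libs/vulkan-layers appears in this report; the related loader/headers atoms may differ), and the lines are written to a temp file here rather than directly to /etc/portage/package.accept_keywords/:

```shell
# Keyword the unstable Vulkan stack for testing (sketch; adjust atoms and arch,
# and move the file under /etc/portage/package.accept_keywords/ on a real system).
kw=/tmp/vulkan-keywords
printf '%s\n' \
  'media-libs/vulkan-layers ~amd64' \
  'media-libs/vulkan-loader ~amd64' \
  'dev-util/vulkan-headers ~amd64' > "$kw"
cat "$kw"
```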
Yes, 1.3.243 does not show the error anymore.
Will be fixed when we stabilize. Thanks for the report. *** This bug has been marked as a duplicate of bug 904477 ***
Will call it a "depends on" instead.
Ah yeah that's better thx