Gentoo's Bugzilla – Attachment 612782 Details for Bug 708726: dev-perl/MogileFS-Server-2.730.0 : Error: Crashed on purpose at lib/MogileFS/Worker/Query.pm line 186.
Description: dev-perl:MogileFS-Server-2.730.0:20200208-121709.log
Filename: dev-perl:MogileFS-Server-2.730.0:20200208-121709.log
MIME Type: text/plain
Creator: Toralf Förster
Created: 2020-02-08 13:18:40 UTC
Size: 71.48 KB
 * Package: dev-perl/MogileFS-Server-2.730.0
 * Repository: gentoo
 * Maintainer: robbat2@gentoo.org perl@gentoo.org
 * USE: abi_x86_64 amd64 elibc_glibc kernel_linux postgres sqlite test userland_GNU
 * FEATURES: network-sandbox preserve-libs sandbox test userpriv usersandbox
 * Adding user 'mogile' to your system ...
 * - Userid: 460
 * - Shell: /sbin/nologin
 * - Home: /dev/null
 * - Groups: (none)
 * - GECOS: added by portage for MogileFS-Server
>>> Unpacking source...
>>> Unpacking MogileFS-Server-2.73.tar.gz to /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work
>>> Source unpacked in /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work
>>> Preparing source in /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73 ...
 * Applying MogileFS-Server-2.720.0-gentoo-init-conf.patch ...
 [ ok ]
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73 ...
 * Using ExtUtils::MakeMaker
 * perl Makefile.PL PREFIX=/usr INSTALLDIRS=vendor INSTALLMAN3DIR=none DESTDIR=/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/
Checking if your kit is complete...
Looks good
Generating a Unix-style Makefile
Writing Makefile for MogileFS::Server
Writing MYMETA.yml and MYMETA.json
>>> Source configured.
>>> Compiling source in /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73 ...
 * emake OTHERLDFLAGS=-Wl,-O1 -Wl,--as-needed
make -j1 'OTHERLDFLAGS=-Wl,-O1 -Wl,--as-needed'
cp lib/MogileFS/ConnectionPool.pm blib/lib/MogileFS/ConnectionPool.pm
cp lib/MogileFS/Store/MySQL.pm blib/lib/MogileFS/Store/MySQL.pm
cp lib/MogileFS/Connection/Parent.pm blib/lib/MogileFS/Connection/Parent.pm
cp lib/MogileFS/Connection/Worker.pm blib/lib/MogileFS/Connection/Worker.pm
cp lib/MogileFS/ReplicationRequest.pm blib/lib/MogileFS/ReplicationRequest.pm
cp lib/MogileFS/Store/SQLite.pm blib/lib/MogileFS/Store/SQLite.pm
cp lib/MogileFS/Device.pm blib/lib/MogileFS/Device.pm
cp lib/MogileFS/Connection/HTTP.pm blib/lib/MogileFS/Connection/HTTP.pm
cp lib/MogileFS/Factory/Domain.pm blib/lib/MogileFS/Factory/Domain.pm
cp lib/MogileFS/Class.pm blib/lib/MogileFS/Class.pm
cp lib/MogileFS/Connection/Client.pm blib/lib/MogileFS/Connection/Client.pm
cp lib/MogileFS/FID.pm blib/lib/MogileFS/FID.pm
cp lib/MogileFS/ReplicationPolicy/Union.pm blib/lib/MogileFS/ReplicationPolicy/Union.pm
cp lib/MogileFS/Exception.pm blib/lib/MogileFS/Exception.pm
cp lib/MogileFS/Factory/Host.pm blib/lib/MogileFS/Factory/Host.pm
cp lib/MogileFS/ReplicationPolicy/MultipleDevices.pm blib/lib/MogileFS/ReplicationPolicy/MultipleDevices.pm
cp lib/MogileFS/Connection/Poolable.pm blib/lib/MogileFS/Connection/Poolable.pm
cp lib/MogileFS/Factory.pm blib/lib/MogileFS/Factory.pm
cp lib/MogileFS/Connection/Mogstored.pm blib/lib/MogileFS/Connection/Mogstored.pm
cp lib/MogileFS/DeviceState.pm blib/lib/MogileFS/DeviceState.pm
cp lib/MogileFS/Factory/Device.pm blib/lib/MogileFS/Factory/Device.pm
cp lib/MogileFS/Domain.pm blib/lib/MogileFS/Domain.pm
cp lib/MogileFS/DevFID.pm blib/lib/MogileFS/DevFID.pm
cp lib/MogileFS/ProcManager.pm blib/lib/MogileFS/ProcManager.pm
cp lib/MogileFS/Server.pm blib/lib/MogileFS/Server.pm
cp lib/MogileFS/Overview.pm blib/lib/MogileFS/Overview.pm
cp lib/MogileFS/Host.pm blib/lib/MogileFS/Host.pm
cp lib/MogileFS/Test.pm blib/lib/MogileFS/Test.pm
cp lib/MogileFS/Checksum.pm blib/lib/MogileFS/Checksum.pm
cp lib/MogileFS/IOStatWatcher.pm blib/lib/MogileFS/IOStatWatcher.pm
cp lib/MogileFS/HTTPFile.pm blib/lib/MogileFS/HTTPFile.pm
cp lib/MogileFS/Store.pm blib/lib/MogileFS/Store.pm
cp lib/MogileFS/ReplicationPolicy.pm blib/lib/MogileFS/ReplicationPolicy.pm
cp lib/MogileFS/Config.pm blib/lib/MogileFS/Config.pm
cp lib/MogileFS/Rebalance.pm blib/lib/MogileFS/Rebalance.pm
cp lib/MogileFS/Factory/Class.pm blib/lib/MogileFS/Factory/Class.pm
cp lib/MogileFS/ReplicationPolicy/MultipleHosts.pm blib/lib/MogileFS/ReplicationPolicy/MultipleHosts.pm
cp lib/MogileFS/Store/Postgres.pm blib/lib/MogileFS/Store/Postgres.pm
cp lib/Mogstored/SideChannelListener.pm blib/lib/Mogstored/SideChannelListener.pm
cp lib/MogileFS/Worker/Query.pm blib/lib/MogileFS/Worker/Query.pm
cp lib/Mogstored/HTTPServer.pm blib/lib/Mogstored/HTTPServer.pm
cp lib/Mogstored/HTTPServer/Perlbal.pm blib/lib/Mogstored/HTTPServer/Perlbal.pm
cp lib/MogileFS/Worker/Monitor.pm blib/lib/MogileFS/Worker/Monitor.pm
cp lib/MogileFS/Worker/Reaper.pm blib/lib/MogileFS/Worker/Reaper.pm
cp lib/Mogstored/TaskQueue.pm blib/lib/Mogstored/TaskQueue.pm
cp lib/Mogstored/ChildProcess/DiskUsage.pm blib/lib/Mogstored/ChildProcess/DiskUsage.pm
cp lib/MogileFS/Worker/Delete.pm blib/lib/MogileFS/Worker/Delete.pm
cp lib/MogileFS/Worker/JobMaster.pm blib/lib/MogileFS/Worker/JobMaster.pm
cp lib/MogileFS/Worker.pm blib/lib/MogileFS/Worker.pm
cp lib/Mogstored/SideChannelClient.pm blib/lib/Mogstored/SideChannelClient.pm
cp lib/Mogstored/HTTPServer/Lighttpd.pm blib/lib/Mogstored/HTTPServer/Lighttpd.pm
cp lib/MogileFS/Worker/Replicate.pm blib/lib/MogileFS/Worker/Replicate.pm
cp lib/Mogstored/HTTPServer/None.pm blib/lib/Mogstored/HTTPServer/None.pm
cp lib/MogileFS/Worker/Fsck.pm blib/lib/MogileFS/Worker/Fsck.pm
cp lib/Mogstored/ChildProcess/IOStat.pm blib/lib/Mogstored/ChildProcess/IOStat.pm
cp lib/Mogstored/HTTPServer/Apache.pm blib/lib/Mogstored/HTTPServer/Apache.pm
cp lib/Mogstored/FIDStatter.pm blib/lib/Mogstored/FIDStatter.pm
cp lib/Mogstored/HTTPServer/Nginx.pm blib/lib/Mogstored/HTTPServer/Nginx.pm
cp lib/MogileFS/Util.pm blib/lib/MogileFS/Util.pm
cp lib/Mogstored/ChildProcess.pm blib/lib/Mogstored/ChildProcess.pm
cp mogautomount blib/script/mogautomount
"/usr/bin/perl" -MExtUtils::MY -e 'MY->fixin(shift)' -- blib/script/mogautomount
cp mogdbsetup blib/script/mogdbsetup
"/usr/bin/perl" -MExtUtils::MY -e 'MY->fixin(shift)' -- blib/script/mogdbsetup
cp mogilefsd blib/script/mogilefsd
"/usr/bin/perl" -MExtUtils::MY -e 'MY->fixin(shift)' -- blib/script/mogilefsd
cp mogstored blib/script/mogstored
"/usr/bin/perl" -MExtUtils::MY -e 'MY->fixin(shift)' -- blib/script/mogstored
Manifying 3 pod documents
>>> Source compiled.
>>> Test phase: dev-perl/MogileFS-Server-2.730.0
PERL_DL_NONLAZY=1 "/usr/bin/perl" "-MExtUtils::Command::MM" "-MTest::Harness" "-e" "undef *Test::Harness::Switches; test_harness(1, 'blib/lib', 'blib/arch')" t/*.t
t/00-startup.t ................
Running.
ok 1 - got mogstored1
Running.
ok 2 - got mogstored2
Option dbuser requires an argument
[Sat Feb 8 13:17:18 2020] beginning run
[Sat Feb 8 13:17:19 2020] Job monitor has only 0, wants 1, making 1.
ok 3
ok 4 - Store provides sane unix_timestamp
[Sat Feb 8 13:17:20 2020] Job job_master has only 0, wants 1, making 1.
[Sat Feb 8 13:17:20 2020] Job replicate has only 0, wants 1, making 1.
[Sat Feb 8 13:17:20 2020] Job queryworker has only 0, wants 2, making 2.
[Sat Feb 8 13:17:20 2020] Job fsck has only 0, wants 1, making 1.
[Sat Feb 8 13:17:20 2020] Job delete has only 0, wants 1, making 1.
[Sat Feb 8 13:17:20 2020] Job reaper has only 0, wants 1, making 1.
ok 5 - test ping worked
Error: Crashed on purpose at lib/MogileFS/Worker/Query.pm line 186.

[Sat Feb 8 13:17:20 2020] [queryworker(63)] Error running command 'test': Crashed on purpose at lib/MogileFS/Worker/Query.pm line 186.
ok 6 - crash didn't
ok 7 - test ping again worked
[Sat Feb 8 13:17:21 2020] Child 64 (queryworker) died: 0 (expected)
ok 8 - set 1 queryworker
ok 9 - response matches expected
[Sat Feb 8 13:17:22 2020] Job queryworker has only 1, wants 2, making 1.
ok 10 - restored 2 queryworkers
ok 11 - created todie domain
ok 12 - delete todie domain
Error deleting domain: domain_not_found Domain not found
ok 13 - didn't delete todie domain again
ok 14 - created def domain
ok 15 - def dmid retrieved
ok 16
ok 17 - modified default to have mindevcount=3
ok 18 - default class has classid=0
ok 19
ok 20 - remove def domain
ok 21 - def nonexistent
ok 22 - def/default class nonexistent
ok 23 - class listing is unchanged
ok 24 - created hasclass domain
ok 25 - created nodel class
Error deleting domain: domain_has_classes domain_has_classes
ok 26 - didn't delete hasclass domain
ok 27 - created nodel class
ok 28 - didn't delete hasclass domain
ok 29 - created test domain
ok 30 - created 1copy class in testdom
ok 31 - created 2copies class in testdom
ok 32 - created a specific policy class
ok 33 - created hostA
ok 34 - created hostB
ok 35 - created dev1 on hostA
[Sat Feb 8 13:17:50 2020] [monitor(60)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
ok 36 - created dev2 on hostA
[Sat Feb 8 13:17:50 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:50 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:50 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:50 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
ok 37 - created dev3 on hostB
[Sat Feb 8 13:17:50 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:50 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:50 2020] [monitor(60)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.2:7500/dev3/usage
[Sat Feb 8 13:17:50 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
ok 38 - created dev4 on hostB
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error
ok 39 - waited for monitor
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
[Sat Feb 8 13:17:51 2020] [monitor(60)] write_error: no HTTP request made
[Sat Feb 8 13:17:51 2020] [monitor(60)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error
ok 40 - waited for monitor
Error: no_devices No devices found to store file at t/00-startup.t line 172, <Sock_127.0.0.1:7001> line 6.
[Sat Feb 8 13:17:51 2020] ending run due to SIGTERM
# Tests were run but no plan was declared and done_testing() was not seen.
# Looks like your test exited with 255 just after 40.
Dubious, test returned 255 (wstat 65280, 0xff00)
All 40 subtests passed
t/01-domain-class.t ...........
ok 1 - got a domain factory
ok 2 - got a class factory
ok 3 - factories are not the same singleton
ok 4 - made a new domain object
ok 5 - domain id is 1
ok 6 - domain namespace is toast
ok 7 - got a class object
ok 8 - class id is 1
ok 9 - class name is fried
ok 10 - class can find a domain object
ok 11 - got the right domain from get_by_id
ok 12 - got the right domain from get_by_name
ok 13 - got two domains back from get_all
ok 14 - and both are domains
ok 15 - and both are domains
ok 16 - and both are not the same
ok 17 - got a mapped class hash
ok 18 - got the right class set
ok 19 - class removed from factory
ok 20 - found three classes
ok 21 - found the blue class
ok 22 - did not find the fried class
ok 23 - class can create policy object
ok 24 - new domain stored in database: 1
ok 25 - new class1 stored in database
ok 26 - new class2 stored in database
ok 27 - can set mindevcount
ok 28 - can set replpolicy
ok 29 - can rename class
ok 30 - can set checksum type
ok 31 - can unset checksum type
ok 32 - domain foo exists
ok 33 - and the id is 1
ok 34 - class bar came back
ok 35 - class baz came back as boo
1..35
ok
t/02-host-device.t ............
ok 1 - got a host factory
ok 2 - got a device factory
ok 3 - factories are not the same singleton
ok 4 - made a new host object
ok 5 - host id is 1
ok 6 - host name is foo
ok 7 - made a new dev object
ok 8 - dev id is 1
ok 9 - name of devs host is foo
ok 10 - can_delete_from works
ok 11 - can_read_from works
ok 12 - should_get_new_files works
ok 13 - host is not alive when down
ok 14 - can_read_from for device fails when host is down
ok 15 - device should not be readable when host is down
ok 16 - host is not alive when dead
ok 17 - can_read_from for device fails when host is dead
ok 18 - device should not be readable when host is dead
ok 19 - device is readable from again
ok 20 - device should not get new files
ok 21 - device is readable from again
ok 22 - device should get new files again
ok 23 - device is NOT readable when down
ok 24 - device is NOT readable when dead
ok 25 - device readable when readonly
ok 26 - device readable when drain
ok 27 - device readable when alive
ok 28 - device can be read from by monitor of unreachable
ok 29 - device should not be read from by non-monitor workers
ok 30 - device not readable
ok 31 - device not writeable
ok 32 - device is unreachable
ok 33 - device is readable again by non-monitor workers
ok 34 - device writable again
ok 35 - device is reachable again
ok 36 - device should not be read from when observed unreachable
ok 37 - device should be read from when observed readable
ok 38 - device should be read from when observed writeable
ok 39 - new host got id 1
ok 40 - created dev1
ok 41 - created dev2
ok 42 - updated host DB entry
ok 43 - updated dev1 DB entry
ok 44 - updated dev2 DB entry
ok 45 - host is as expected
ok 46 - dev1 is as expected
ok 47 - dev2 is as expected
1..47
ok
t/10-weighting.t ..............
Running.
ok 1 - got mogstored1
Running.
ok 2 - got mogstored2
Option dbuser requires an argument
[Sat Feb 8 13:18:22 2020] beginning run
[Sat Feb 8 13:18:22 2020] Job monitor has only 0, wants 1, making 1.
ok 3
[Sat Feb 8 13:18:23 2020] Job reaper has only 0, wants 1, making 1.
[Sat Feb 8 13:18:23 2020] Job job_master has only 0, wants 1, making 1.
[Sat Feb 8 13:18:23 2020] Job delete has only 0, wants 1, making 1.
[Sat Feb 8 13:18:23 2020] Job fsck has only 0, wants 1, making 1.
[Sat Feb 8 13:18:23 2020] Job queryworker has only 0, wants 2, making 2.
[Sat Feb 8 13:18:23 2020] Job replicate has only 0, wants 1, making 1.
ok 4 - created test domain
ok 5 - created 2copies class in testdom
ok 6 - created hostA
ok 7 - created hostB
[Sat Feb 8 13:18:31 2020] [monitor(135)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
[Sat Feb 8 13:18:31 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
ok 8 - created dev1 on hostA
[Sat Feb 8 13:18:31 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:31 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:18:31 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:31 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
ok 9 - created dev2 on hostB
[Sat Feb 8 13:18:32 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:32 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:18:32 2020] [monitor(135)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.2:7500/dev2/usage
[Sat Feb 8 13:18:32 2020] [monitor(135)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
ok 10 - set dev1 weight=50 on hostA
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
ok 11 - set dev2 weight=50 on hostB
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
ok 12 - waited for monitor
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
[Sat Feb 8 13:18:33 2020] [monitor(135)] write_error: no HTTP request made
[Sat Feb 8 13:18:33 2020] [monitor(135)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
ok 13 - waited for monitor
not ok 14 - got filehandle

# Failed test 'got filehandle'
# at t/10-weighting.t line 94.
Error: no_devices No devices found to store file at t/10-weighting.t line 96, <Sock_127.0.0.1:7001> line 3.
[Sat Feb 8 13:18:33 2020] ending run due to SIGTERM
# Tests were run but no plan was declared and done_testing() was not seen.
# Looks like your test exited with 255 just after 14.
Dubious, test returned 255 (wstat 65280, 0xff00)
Failed 1/14 subtests
t/20-filepaths.t ..............
1..0 # SKIP Filepaths plugin has been separated from the server, a bit of work is needed to make the tests run again.
skipped: Filepaths plugin has been separated from the server, a bit of work is needed to make the tests run again.
t/30-rebalance.t ..............
Running.
ok 1 - got mogstored1
Running.
ok 2 - got mogstored2
Option dbuser requires an argument
[Sat Feb 8 13:18:48 2020] beginning run
[Sat Feb 8 13:18:48 2020] Job monitor has only 0, wants 1, making 1.
ok 3
[Sat Feb 8 13:18:50 2020] Job fsck has only 0, wants 1, making 1.
[Sat Feb 8 13:18:50 2020] Job delete has only 0, wants 1, making 1.
[Sat Feb 8 13:18:50 2020] Job replicate has only 0, wants 1, making 1.
[Sat Feb 8 13:18:50 2020] Job job_master has only 0, wants 1, making 1.
[Sat Feb 8 13:18:50 2020] Job queryworker has only 0, wants 2, making 2.
[Sat Feb 8 13:18:50 2020] Job reaper has only 0, wants 1, making 1.
>ok 4 - test ping worked >ok 5 - created test domain >ok 6 - created 1copy class in testdom >ok 7 - created 2copies class in testdom >ok 8 - created hostA >ok 9 - created hostB >ok 10 - created dev1 on hostA >[Sat Feb 8 13:19:01 2020] [monitor(179)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev2/usage >[Sat Feb 8 13:19:01 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back >[Sat Feb 8 13:19:01 2020] [monitor(179)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage >[Sat Feb 8 13:19:01 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back >ok 11 - created dev2 on hostA >[Sat Feb 8 13:19:02 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:02 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:02 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:02 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? 
Error was: 500 write_error >ok 12 - created dev3 on hostB >[Sat Feb 8 13:19:02 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:02 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:02 2020] [monitor(179)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.2:7500/dev3/usage >[Sat Feb 8 13:19:02 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back >ok 13 - created dev4 on hostB >[Sat Feb 8 13:19:03 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:03 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:03 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:03 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:03 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:03 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:03 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:03 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >Running. >ok 14 - got mogstored3 >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? 
Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >ok 15 - created hostC >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? 
Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:05 2020] [monitor(179)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.3:7500/dev5/usage >[Sat Feb 8 13:19:05 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev5/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back >ok 16 - created dev5 on hostC >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev6/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev5/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? 
Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error >ok 17 - created dev6 on hostC >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev6/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev5/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error >ok 18 - waited for monitor >[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made >[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? 
Error was: 500 write_error
>[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made
>[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev4/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made
>[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev6/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made
>[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.3 (http://127.0.1.3:7500/dev5/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made
>[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:06 2020] [monitor(179)] write_error: no HTTP request made
>[Sat Feb 8 13:19:06 2020] [monitor(179)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev3/usage)? Error was: 500 write_error
>ok 19 - waited for monitor
># Creating 300 files...
>Failed to create manyhundred_1: no_devices No devices found to store file at t/30-rebalance.t line 94, <Sock_127.0.0.1:7001> line 4.
>[Sat Feb 8 13:19:06 2020] ending run due to SIGTERM
># Tests were run but no plan was declared and done_testing() was not seen.
># Looks like your test exited with 255 just after 19.
>Dubious, test returned 255 (wstat 65280, 0xff00)
>All 19 subtests passed
>t/40-httpfile.t ...............
>Running.
>Option dbuser requires an argument
>[Sat Feb 8 13:19:19 2020] beginning run
>[Sat Feb 8 13:19:19 2020] Job monitor has only 0, wants 1, making 1.
>ok 1
>[Sat Feb 8 13:19:20 2020] Job reaper has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:20 2020] Job job_master has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:20 2020] Job queryworker has only 0, wants 2, making 2.
>[Sat Feb 8 13:19:20 2020] Job delete has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:20 2020] Job fsck has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:20 2020] Job replicate has only 0, wants 1, making 1.
>ok 2 - test ping worked
>ok 3 - created test domain
>ok 4 - created 1copy class in testdom
>ok 5 - created hostA
>ok 6 - created dev1 on hostA
>[Sat Feb 8 13:19:28 2020] [monitor(231)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
>[Sat Feb 8 13:19:28 2020] [monitor(231)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>ok 7 - waited for monitor
>[Sat Feb 8 13:19:28 2020] [monitor(231)] write_error: no HTTP request made
>[Sat Feb 8 13:19:28 2020] [monitor(231)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:28 2020] [monitor(231)] write_error: no HTTP request made
>[Sat Feb 8 13:19:28 2020] [monitor(231)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>ok 8 - waited for monitor
>Failed to create file: no_devices No devices found to store file at t/40-httpfile.t line 72, <Sock_127.0.0.1:7001> line 4.
>[Sat Feb 8 13:19:28 2020] ending run due to SIGTERM
># Tests were run but no plan was declared and done_testing() was not seen.
># Looks like your test exited with 255 just after 8.
>Dubious, test returned 255 (wstat 65280, 0xff00)
>All 8 subtests passed
>t/50-checksum.t ...............
>Running.
>ok 1 - got mogstored1
>Running.
>ok 2 - got mogstored2
>Waiting on usage...
>Option dbuser requires an argument
>[Sat Feb 8 13:19:34 2020] beginning run
>[Sat Feb 8 13:19:34 2020] Job monitor has only 0, wants 1, making 1.
>ok 3
>[Sat Feb 8 13:19:36 2020] Job fsck has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:36 2020] Job replicate has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:36 2020] Job job_master has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:36 2020] Job delete has only 0, wants 1, making 1.
>[Sat Feb 8 13:19:36 2020] Job queryworker has only 0, wants 2, making 2.
>[Sat Feb 8 13:19:36 2020] Job reaper has only 0, wants 1, making 1.
>ok 4 - created test domain
>ok 5 - created 2copies class in testdom
>ok 6 - created 1copy class in testdom
>ok 7 - created hostA
>ok 8 - created hostB
>ok 9 - created dev1 on hostA
>[Sat Feb 8 13:19:46 2020] [monitor(264)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
>[Sat Feb 8 13:19:46 2020] [monitor(264)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>[Sat Feb 8 13:19:46 2020] [monitor(264)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.2:7500/dev2/usage
>[Sat Feb 8 13:19:46 2020] [monitor(264)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>[Sat Feb 8 13:19:46 2020] [monitor(264)] write_error: no HTTP request made
>[Sat Feb 8 13:19:46 2020] [monitor(264)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:46 2020] [monitor(264)] write_error: no HTTP request made
>[Sat Feb 8 13:19:46 2020] [monitor(264)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>ok 10 - created dev2 on hostB
>[Sat Feb 8 13:19:47 2020] [monitor(264)] write_error: no HTTP request made
>[Sat Feb 8 13:19:47 2020] [monitor(264)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)?
Error was: 500 write_error
>[Sat Feb 8 13:19:47 2020] [monitor(264)] write_error: no HTTP request made
>[Sat Feb 8 13:19:47 2020] [monitor(264)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>ok 11 - waited for monitor
>[Sat Feb 8 13:19:47 2020] [monitor(264)] write_error: no HTTP request made
>[Sat Feb 8 13:19:47 2020] [monitor(264)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:19:47 2020] [monitor(264)] write_error: no HTTP request made
>[Sat Feb 8 13:19:47 2020] [monitor(264)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>ok 12 - waited for monitor
>Can't use an undefined value as a HASH reference at t/50-checksum.t line 93, <Sock_127.0.0.1:7001> line 3.
>[Sat Feb 8 13:19:47 2020] ending run due to SIGTERM
># Tests were run but no plan was declared and done_testing() was not seen.
># Looks like your test exited with 255 just after 12.
>Dubious, test returned 255 (wstat 65280, 0xff00)
>All 12 subtests passed
>t/51-checksum_class_change.t ..
>Running.
>ok 1 - got mogstored1
>Running.
>ok 2 - got mogstored2
>Waiting on usage...
>Option dbuser requires an argument
>[Sat Feb 8 13:19:59 2020] beginning run
>[Sat Feb 8 13:19:59 2020] Job monitor has only 0, wants 1, making 1.
>ok 3
>[Sat Feb 8 13:20:00 2020] Job replicate has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:00 2020] Job delete has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:00 2020] Job queryworker has only 0, wants 2, making 2.
>[Sat Feb 8 13:20:00 2020] Job fsck has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:00 2020] Job reaper has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:00 2020] Job job_master has only 0, wants 1, making 1.
>ok 4 - created test domain
>ok 5 - created changer class in testdom with hashtype=MD5
>ok 6 - created hostA
>ok 7 - created hostB
>[Sat Feb 8 13:20:08 2020] [monitor(306)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
>[Sat Feb 8 13:20:08 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>ok 8 - created dev1 on hostA
>[Sat Feb 8 13:20:08 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:08 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:09 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>ok 9 - created dev2 on hostB
>[Sat Feb 8 13:20:09 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:09 2020] [monitor(306)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.2:7500/dev2/usage
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>[Sat Feb 8 13:20:09 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:09 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)?
Error was: 500 write_error
>ok 10 - waited for monitor
>[Sat Feb 8 13:20:09 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:09 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:09 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>ok 11 - waited for monitor
>[Sat Feb 8 13:20:11 2020] Child 308 (replicate) died: 0 (expected)
>[Sat Feb 8 13:20:11 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:11 2020] [monitor(306)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:11 2020] [monitor(306)] write_error: no HTTP request made
>[Sat Feb 8 13:20:11 2020] [monitor(306)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>Can't use an undefined value as a HASH reference at t/51-checksum_class_change.t line 84, <Sock_127.0.0.1:7001> line 3.
>[Sat Feb 8 13:20:11 2020] ending run due to SIGTERM
># Tests were run but no plan was declared and done_testing() was not seen.
># Looks like your test exited with 255 just after 11.
>Dubious, test returned 255 (wstat 65280, 0xff00)
>All 11 subtests passed
>t/60-fsck.t ...................
>Running.
>ok 1 - got mogstored1
>Running.
>ok 2 - got mogstored2
>Option dbuser requires an argument
>[Sat Feb 8 13:20:24 2020] beginning run
>[Sat Feb 8 13:20:24 2020] Job monitor has only 0, wants 1, making 1.
>ok 3
>[Sat Feb 8 13:20:25 2020] Job reaper has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:25 2020] Job fsck has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:25 2020] Job queryworker has only 0, wants 2, making 2.
>[Sat Feb 8 13:20:25 2020] Job replicate has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:25 2020] Job delete has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:25 2020] Job job_master has only 0, wants 1, making 1.
>ok 4 - created test domain
>ok 5 - created 2copies class in testdom
>ok 6 - created 1copy class in testdom
>ok 7 - created hostA
>ok 8 - created hostB
>ok 9 - created dev1 on hostA
>[Sat Feb 8 13:20:36 2020] [monitor(346)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.2:7500/dev2/usage
>[Sat Feb 8 13:20:36 2020] [monitor(346)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>[Sat Feb 8 13:20:36 2020] [monitor(346)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
>[Sat Feb 8 13:20:36 2020] [monitor(346)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 read_response_headers: Server closed connection without sending any data back
>[Sat Feb 8 13:20:36 2020] [monitor(346)] write_error: no HTTP request made
>[Sat Feb 8 13:20:36 2020] [monitor(346)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:36 2020] [monitor(346)] write_error: no HTTP request made
>[Sat Feb 8 13:20:36 2020] [monitor(346)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>ok 10 - created dev2 on hostB
>[Sat Feb 8 13:20:37 2020] [monitor(346)] write_error: no HTTP request made
>[Sat Feb 8 13:20:37 2020] [monitor(346)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:37 2020] [monitor(346)] write_error: no HTTP request made
>[Sat Feb 8 13:20:37 2020] [monitor(346)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)?
Error was: 500 write_error
>ok 11 - waited for monitor
>[Sat Feb 8 13:20:37 2020] [monitor(346)] write_error: no HTTP request made
>[Sat Feb 8 13:20:37 2020] [monitor(346)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:37 2020] [monitor(346)] write_error: no HTTP request made
>[Sat Feb 8 13:20:37 2020] [monitor(346)] Port 7500 not listening on 127.0.1.2 (http://127.0.1.2:7500/dev2/usage)? Error was: 500 write_error
>ok 12 - waited for monitor
>Can't use an undefined value as a symbol reference at t/60-fsck.t line 168, <Sock_127.0.0.1:7001> line 3.
>[Sat Feb 8 13:20:37 2020] ending run due to SIGTERM
># Tests were run but no plan was declared and done_testing() was not seen.
># Looks like your test exited with 255 just after 12.
>Dubious, test returned 255 (wstat 65280, 0xff00)
>All 12 subtests passed
>t/70-reaper.t .................
>Running.
>ok 1 - got mogstored
>Option dbuser requires an argument
>[Sat Feb 8 13:20:48 2020] beginning run
>[Sat Feb 8 13:20:48 2020] Job monitor has only 0, wants 1, making 1.
>ok 2
>[Sat Feb 8 13:20:50 2020] Job delete has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:50 2020] Job replicate has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:50 2020] Job job_master has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:50 2020] Job reaper has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:50 2020] Job fsck has only 0, wants 1, making 1.
>[Sat Feb 8 13:20:50 2020] Job queryworker has only 0, wants 2, making 2.
>ok 3 - created hostA
>[Sat Feb 8 13:20:52 2020] [monitor(384)] read_response_headers: Server closed connection without sending any data back: GET http://127.0.1.1:7500/dev1/usage
>[Sat Feb 8 13:20:52 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)?
Error was: 500 read_response_headers: Server closed connection without sending any data back
>ok 4 - created dev1 on hostA
>[Sat Feb 8 13:20:52 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:52 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:52 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:52 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
>ok 5 - created dev2 on hostA
>[Sat Feb 8 13:20:52 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:52 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:52 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:52 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:53 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:53 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:53 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:53 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)? Error was: 500 write_error
>ok 6 - created test domain
>[Sat Feb 8 13:20:53 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:53 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev1/usage)? Error was: 500 write_error
>[Sat Feb 8 13:20:53 2020] [monitor(384)] write_error: no HTTP request made
>[Sat Feb 8 13:20:53 2020] [monitor(384)] Port 7500 not listening on 127.0.1.1 (http://127.0.1.1:7500/dev2/usage)?
Error was: 500 write_error
>ok 7 - created 2copies class in testdom
>ok 8 - set queue_rate_for_reaper
>not ok 9 - got filehandle
>
># Failed test 'got filehandle'
># at t/70-reaper.t line 58.
>Can't use an undefined value as a symbol reference at t/70-reaper.t line 59, <Sock_127.0.0.1:7001> line 1.
>[Sat Feb 8 13:20:53 2020] ending run due to SIGTERM
># Tests were run but no plan was declared and done_testing() was not seen.
># Looks like your test exited with 255 just after 9.
>Dubious, test returned 255 (wstat 65280, 0xff00)
>Failed 1/9 subtests
>t/80-job_master.t ............. Option dbuser requires an argument
>
>[Sat Feb 8 13:21:02 2020] beginning run
>[Sat Feb 8 13:21:02 2020] Job monitor has only 0, wants 1, making 1.
>[Sat Feb 8 13:21:03 2020] Job reaper has only 0, wants 1, making 1.
>[Sat Feb 8 13:21:03 2020] Job queryworker has only 0, wants 2, making 2.
>ok 1 - wait for queryworker
>ok 2 - no fsck workers
>ok 3 - no delete workers
>ok 4 - no replicate workers
>[Sat Feb 8 13:21:04 2020] Job replicate has only 0, wants 1, making 1.
>[Sat Feb 8 13:21:04 2020] Job job_master has only 0, wants 1, making 1.
>[Sat Feb 8 13:21:04 2020] Job delete has only 0, wants 1, making 1.
>[Sat Feb 8 13:21:04 2020] Job fsck has only 0, wants 1, making 1.
>ok 5 - wait for queryworker
>ok 6 - wait for fsck
>ok 7 - wait for delete
>ok 8 - wait for replicate
>[Sat Feb 8 13:21:06 2020] Child 410 (job_master) died: 0 (expected)
>[Sat Feb 8 13:21:07 2020] Child 409 (replicate) died: 0 (expected)
>[Sat Feb 8 13:21:07 2020] Child 411 (delete) died: 0 (expected)
>[Sat Feb 8 13:21:07 2020] Child 412 (fsck) died: 0 (expected)
>ok 9 - wait for fsck to die
>ok 10 - wait for delete to die
>ok 11 - wait for replicate to die
>1..11
>[Sat Feb 8 13:21:07 2020] ending run due to SIGTERM
>ok
>t/checksum.t ..................
>ok 1
>ok 2
>ok 3 - is a ref
>ok 4 - hex matches
>ok 5 - save successfully
>ok 6 - hex matches
>ok 7 - MogileFS::FID->checksum works
>1..7
>ok
>t/fid-stat.t ..................
>ok 1 - got tempdir
>ok 2 - tempdir is writable
>ok 3 - made statter
>ok 4 - no stats on empty directory
>ok 5 - no contents on empty directory
>ok 6 - 500 stats
>ok 7 - 500 fids found
>ok 8 - found 4 files
>ok 9 - and statted 4 files
>ok 10 - found 3 files
>ok 11 - and statted 3 files
>1..11
>ok
>t/http.t ......................
>ok 1 - host created
>ok 2 - conn_pool_size took effect
>ok 3 - HTTP response is success
>ok 4 - got HTTP 200 response
>ok 5 - connection placed in GET pool
>ok 6 - child killed
>ok 7 - child reaped
>ok 8 - nothing is counted to be inflight
>ok 9 - nothing is really inflight
>ok 10 - connection pool task queue is empty
>ok 11 - HTTP response is successful
>ok 12 - response matches expected
>ok 13 - child killed
>ok 14 - child reaped
>ok 15 - nothing is counted to be inflight
>ok 16 - nothing is really inflight
>ok 17 - connection pool task queue is empty
>ok 18 - HTTP response is successful on trickle
>ok 19 - trickled response matches expected
>ok 20 - child killed
>ok 21 - child reaped
>ok 22 - nothing is counted to be inflight
>ok 23 - nothing is really inflight
>ok 24 - connection pool task queue is empty
>ok 25 - HTTP response is success
>ok 26 - connection placed in pool
>ok 27 - child killed
>ok 28 - child reaped
>ok 29 - new connection placed in pool
>ok 30 - reference not reused
>ok 31 - child killed
>ok 32 - child reaped
>ok 33 - nothing is counted to be inflight
>ok 34 - nothing is really inflight
>ok 35 - connection pool task queue is empty
>ok 36 - got HTTP response
>ok 37 - HTTP response is successful
>ok 38 - response matched
>ok 39 - connection placed in connection pool
>ok 40 - got connection
>ok 41 - got HTTP response
>ok 42 - HTTP response is successful
>ok 43 - response matched
>ok 44 - connection placed in connection pool
>ok 45 - existing
connection reused (#2)
>ok 46 - got HTTP response
>ok 47 - HTTP response is successful
>ok 48 - response matched
>ok 49 - connection placed in connection pool
>ok 50 - existing connection reused (#3)
>ok 51 - got HTTP response
>ok 52 - HTTP response is successful
>ok 53 - response matched
>ok 54 - connection placed in connection pool
>ok 55 - existing connection reused (#4)
>ok 56 - got HTTP response
>ok 57 - HTTP response is successful
>ok 58 - response matched
>ok 59 - connection placed in connection pool
>ok 60 - existing connection reused (#5)
>ok 61 - got HTTP response
>ok 62 - HTTP response is successful
>ok 63 - response matched
>ok 64 - connection placed in connection pool
>ok 65 - existing connection reused (#6)
>ok 66 - child killed
>ok 67 - child reaped
>ok 68 - nothing is counted to be inflight
>ok 69 - nothing is really inflight
>ok 70 - connection pool task queue is empty
>[Sat Feb 8 13:21:15 2020] node_timeout: 1 (elapsed: 1.00199890136719): GET http://127.0.0.1:43265/node-time-me-out-0
>ok 71 - HTTP response is not successful
>ok 72 - node_timeout hit
>ok 73 - connection pool is empty
>ok 74 - child killed
>ok 75 - child reaped
>ok 76 - nothing is counted to be inflight
>ok 77 - nothing is really inflight
>ok 78 - connection pool task queue is empty
>[Sat Feb 8 13:21:16 2020] node_timeout: 1 (elapsed: 1.00166082382202): GET http://127.0.0.1:43265/node-time-me-out-1
>ok 79 - HTTP response is not successful
>ok 80 - node_timeout hit
>ok 81 - connection pool is empty
>ok 82 - child killed
>ok 83 - child reaped
>ok 84 - nothing is counted to be inflight
>ok 85 - nothing is really inflight
>ok 86 - connection pool task queue is empty
>ok 87 - connection pool is empty
>[Sat Feb 8 13:21:16 2020] read_response_headers: Server closed connection without sending any data back: GET http://127.0.0.1:43265/drop-me
>ok 88 - HTTP response is not successful
>ok 89 - connection pool is empty
>ok 90 - child killed
>ok 91 - child reaped
>ok 92 - nothing is counted
to be inflight
>ok 93 - nothing is really inflight
>ok 94 - connection pool task queue is empty
>ok 95 - connection pool is empty
>[Sat Feb 8 13:21:16 2020] write_error: no HTTP request made
>ok 96 - HTTP response is not successful
>ok 97 - X-MFS-Error is set
>ok 98 - connection pool is empty
>ok 99 - nothing is counted to be inflight
>ok 100 - nothing is really inflight
>ok 101 - connection pool task queue is empty
>1..101
>ok
>t/mogstored-shutdown.t ........ (No info could be read for "-p": geteuid()=250 but you should be root.)
>
>ok 1 - started daemonized mogstored
>(Not all processes could be identified, non-owned process info
> will not be shown, you would have to be root to see it all.)
>real_pid = 429
>ok 2 - got mgmt connection
>ok 3 - mgmt port readable
>ok 4 - gone
>1..4
>ok
>t/multiple-devices-replpol.t ..
>ok 1 - all good
>ok 2 - any device will do
>ok 3 - 2 devices is enough
>ok 4 - anywhere
>ok 5
>ok 6
>ok 7
>ok 8
>ok 9
>ok 10
>ok 11
>ok 12
>ok 13
>ok 14 - we are happy with one copy in a drain device
>ok 15 - the extra copy in drain leaves us too satisfied
>1..15
>ok
>t/multiple-hosts-replpol.t ....
>ok 1 - all good
>ok 2 - need host2
>ok 3 - need host2, even though 2 on host1
>ok 4 - anywhere
>ok 5
>ok 6
>ok 7
>ok 8
>ok 9
>ok 10
>ok 11
>ok 12
>ok 13
>ok 14 - we are happy with one copy in a drain device
>ok 15 - the extra copy in drain leaves us too satisfied
>1..15
>ok
>t/replpolicy-parsing.t ........
>ok 1 - 'got a multiple hosts policy' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 2 - got correct devcount
>ok 3 - 'got a multiple hosts policy' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 4 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 5 - got 2 sub policies
>ok 6 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 7 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 8 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 9 - got 2 sub policies
>ok 10 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 11 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 12 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 13 - got 2 sub policies
>ok 14 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 15 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 16 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 17 - got 2 sub policies
>ok 18 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 19 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 20 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 21 - got 2 sub policies
>ok 22 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 23 - An object of class
'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 24 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 25 - got 2 sub policies
>ok 26 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 27 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 28 - An object of class 'MogileFS::ReplicationPolicy::Union' isa 'MogileFS::ReplicationPolicy::Union'
>ok 29 - got 2 sub policies
>ok 30 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>ok 31 - An object of class 'MogileFS::ReplicationPolicy::MultipleHosts' isa 'MogileFS::ReplicationPolicy::MultipleHosts'
>1..31
>ok
>t/replpolicy.t ................
>1..11
>ok 1 - Expected happiness
>ok 2 - ... but not too happy
>ok 3 - Expected happiness
>ok 4 - ... and too happy too
>ok 5 - Expected unhappiness
>ok 6 - List of ideal devices
>ok 7 - Empty list of desperate devices
>ok 8 - Expected unhappiness
>ok 9 - No ideal devices
>ok 10 - List of desperate devices
>ok 11 - Expected temporary failure
>ok
>t/store-sqlite.t ..............
>1..28
>ok 1 - store is sane
>ok 2 - setup database
>ok 3 - IPC pipe is ready
>ok 4 - child wakes us up
>ok 5 - child wakes parent up properly
>ok 6 - fails to lock while child has lock
>ok 7 - kill successful
>ok 8 - waitpid successful
>ok 9 - child dies correctly
>ok 10 - acquire lock when child dies
>ok 11 - recursion fails
>ok 12 - proper error on failed lock
>ok 13 - lock release
>ok 14 - acquire lock with 0 timeout
>ok 15 - lock release
>ok 16 - redundant lock release
>ok 17 - parent woken up
>ok 18 - child wakes parent up properly
>ok 19 - acquire lock eventually
>ok 20 - waitpid successful
>ok 21 - child dies correctly
>ok 22 - lock release
>ok 23 - parent woken up
>ok 24 - child wakes parent up properly
>ok 25 - kill -9 successful
>ok 26 - waitpid successful
>ok 27 - child was SIGKILL-ed
>ok 28 - acquire lock in parent
>ok
>t/store.t .....................
>ok 1 - created a domain
>ok 2 - created a class
>ok 3 - made devfid
>ok 4 - added to db
>ok 5 - got fid from df
>ok 6 - FID 200 on one device
>ok 7 - is correct number
>ok 8 - did mass insert
>ok 9 - FID 101 on 2 devices
>ok 10 - got a fidid
>ok 11 - didn't get fidid
>ok 12 - got a dup into tempfile
>ok 13 - ignore_replace should return value for storage type 'SQLite'
>ok 14 - ignore_replace should not die for storage type 'SQLite'
>ok 15 - retry_on_deadlock return value for 'SQLite': 0E0
>ok 16 - retry_on_deadlock got an exception on bad sql 'SQLite'
>ok 17 - retry_on_deadlock got a non-sql exception for 'SQLite'
>ok 18 - no exception on retry_on_deadlock while inducing a deadlock
>ok 19 - retry_on_deadlock returned good iteration count while inducing a deadlock
>ok 20 - retry_on_deadlock got expected exemption inducing a deadlock with bad sql
>ok 21 - retry_on_deadlock got proper return value and iteration while inducing a deadlock
>ok 22 - checksum matches expected
>ok 23 - checksum fid set correctly
>ok 24 - hashtype set correctly
>ok 25 - checksum matches expected
>ok 26 - checksum fid set correctly
>ok 27 - hashtype set correctly
>ok 28 - checksum deleted OK
>ok 29 - checksum delete MISS
>ok 30 - undef on missing checksum
>ok 31 - list_keys matches insensitively
>ok 32 - case-incorrect list_keys fails to match
>ok 33 - case-correct list_keys matches
>ok 34 - only one row matched
>ok 35 - list_keys matches insensitively (again)
>1..35
>ok
>t/util.t ......................
>ok 1 - weightest list
>1..1
>ok
>
>Test Summary Report
>-------------------
>t/00-startup.t (Wstat: 65280 Tests: 40 Failed: 0)
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/10-weighting.t (Wstat: 65280 Tests: 14 Failed: 1)
> Failed test: 14
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/30-rebalance.t (Wstat: 65280 Tests: 19 Failed: 0)
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/40-httpfile.t (Wstat: 65280 Tests: 8 Failed: 0)
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/50-checksum.t (Wstat: 65280 Tests: 12 Failed: 0)
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/51-checksum_class_change.t (Wstat: 65280 Tests: 11 Failed: 0)
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/60-fsck.t (Wstat: 65280 Tests: 12 Failed: 0)
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>t/70-reaper.t (Wstat: 65280 Tests: 9 Failed: 1)
> Failed test: 9
> Non-zero exit status: 255
> Parse errors: No plan found in TAP output
>Files=23, Tests=477, 260 wallclock secs ( 0.19 usr 0.06 sys + 15.98 cusr 4.58 csys = 20.81 CPU)
>Result: FAIL
>Failed 8/23 test programs. 2/477 subtests failed.
>make: *** [Makefile:978: test_dynamic] Error 255
>>>> Completed testing dev-perl/MogileFS-Server-2.730.0
>
>>>> Install dev-perl/MogileFS-Server-2.730.0 into /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/
>make -j1 pure_install
>Manifying 3 pod documents
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/ChildProcess.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/FIDStatter.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/SideChannelClient.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/TaskQueue.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/HTTPServer.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/SideChannelListener.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/ChildProcess/IOStat.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/ChildProcess/DiskUsage.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/HTTPServer/Nginx.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/HTTPServer/Apache.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/HTTPServer/None.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/HTTPServer/Lighttpd.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/Mogstored/HTTPServer/Perlbal.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Util.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Rebalance.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Config.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ReplicationPolicy.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Store.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/HTTPFile.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/IOStatWatcher.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Checksum.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Test.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Host.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Overview.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Server.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ProcManager.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/DevFID.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Domain.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/DeviceState.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Factory.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Exception.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/FID.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Class.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Device.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ReplicationRequest.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ConnectionPool.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/Fsck.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/Replicate.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/JobMaster.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/Delete.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/Reaper.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/Monitor.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Worker/Query.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ReplicationPolicy/MultipleHosts.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ReplicationPolicy/MultipleDevices.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/ReplicationPolicy/Union.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Factory/Class.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Factory/Device.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Factory/Host.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Factory/Domain.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Connection/Mogstored.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Connection/Poolable.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Connection/Client.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Connection/HTTP.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Connection/Worker.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Connection/Parent.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Store/Postgres.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Store/SQLite.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/lib64/perl5/vendor_perl/5.30.1/MogileFS/Store/MySQL.pm
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/share/man/man1/mogstored.1
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/share/man/man1/mogilefsd.1
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/share/man/man1/mogautomount.1
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/bin/mogstored
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/bin/mogilefsd
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/bin/mogdbsetup
>Installing /var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/image/usr/bin/mogautomount
> * Fixing packlist file /usr/lib64/perl5/vendor_perl/5.30.1/x86_64-linux/auto/MogileFS/Server/.packlist
>usage: doins.py [-h] [-g GROUP] [-o OWNER] [-m MODE] [-p]
>doins.py: error: argument -g/--group: invalid _parse_group value: 'mogile'
> * ERROR: dev-perl/MogileFS-Server-2.730.0::gentoo failed (install phase):
> * doins failed
> *
> * If you need support, post the output of `emerge --info '=dev-perl/MogileFS-Server-2.730.0::gentoo'`,
> * the complete build log and the output of `emerge -pqv '=dev-perl/MogileFS-Server-2.730.0::gentoo'`.
> * The complete build log is located at '/var/log/portage/dev-perl:MogileFS-Server-2.730.0:20200208-121709.log'.
> * For convenience, a symlink to the build log is located at '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/temp/build.log'.
> * The ebuild environment file is located at '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/temp/environment'.
> * Working directory: '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73'
> * S: '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73'
> * ERROR: dev-perl/MogileFS-Server-2.730.0::gentoo failed (install phase):
> * newins failed
> *
> * If you need support, post the output of `emerge --info '=dev-perl/MogileFS-Server-2.730.0::gentoo'`,
> * the complete build log and the output of `emerge -pqv '=dev-perl/MogileFS-Server-2.730.0::gentoo'`.
> * The complete build log is located at '/var/log/portage/dev-perl:MogileFS-Server-2.730.0:20200208-121709.log'.
> * For convenience, a symlink to the build log is located at '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/temp/build.log'.
> * The ebuild environment file is located at '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/temp/environment'.
> * Working directory: '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73'
> * S: '/var/tmp/portage/dev-perl/MogileFS-Server-2.730.0/work/MogileFS-Server-2.73'