Hi, is there a limit on how many files I can have open? I get stuck at 1021 files.
Is this related to MVE? Please post the console output and a backtrace if possible.
Program received signal SIGABRT, Aborted.
[Switching to Thread 0x7ff426a46700 (LWP 15483)]
0x00007ff42a1c7267 in __GI_raise (sig=sig@entry=6)
at ../sysdeps/unix/sysv/linux/raise.c:55
55 ../sysdeps/unix/sysv/linux/raise.c: No such file or directory.
(gdb) bt
at ../sysdeps/unix/sysv/linux/raise.c:55
from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
from /lib/x86_64-linux-gnu/libgcc_s.so.1
at /usr/include/c++/4.9/bits/basic_string.h:240
at /usr/include/c++/4.9/bits/basic_string.h:547
proxy=proxy@entry=0x1cd4af0, init_only=init_only@entry=false)
at view.cc:790
proxy=proxy@entry=0x1cd4af0, update=update@entry=false) at view.cc:739
this@entry=0x3fc, name="original", type=type@entry=mve::IMAGE_TYPE_UINT8)
at view.cc:405
at ../../libs/mve/view.h:436
at pthread_create.c:333
at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
Are you working with the latest version of MVE? Can the problem be consistently reproduced? Is this a particularly large dataset? Where did you get the "Too many files open" error message from? I cannot see it in the console output.
I reproduced it several times on two different computers, always with the latest version of MVE. The error occurs during feature detection and always at file 1021. The dataset is huge: about 16,000 pictures.
View ID 1013 (1920x1080), 1860 features, took 3268 ms.
Detecting features, view 1021 of 16295 (6.2%)...terminate called after throwing an instance of 'util::Exception'
what(): Error opening file: Too many open files
Aborted (core dumped)
Several shells limit the number of file descriptors a process is allowed to open. Within bash you can use ulimit -n
to check and set this limit. It defaults to 1024, and I assume it is maxed out in your case (stdin, stdout, stderr, the prebundle, and 1020 other views).
However, the question is: why are so many views open concurrently?
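For reference, checking and raising the limit from bash looks like this (4096 is just an illustrative value, not a recommendation):

```shell
# Show the current soft limit on open file descriptors (commonly 1024)
ulimit -Sn

# Show the hard limit; the soft limit can be raised up to this without root
ulimit -Hn

# Raise the soft limit for this shell and its children, e.g. before running
# the MVE tools (4096 is illustrative; it must not exceed the hard limit)
ulimit -n 4096 2>/dev/null || echo "hard limit too low to raise to 4096"
```

Note that the change only applies to the current shell session and processes started from it.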
Still too many open files. ulimit of course "solves" it.
We will take another look.
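In the meantime, a quick way to confirm that descriptors really accumulate during feature detection is to watch the process's fd table under Linux. This sketch inspects the current shell as a stand-in; substitute the PID of the actual MVE process:

```shell
# Count the file descriptors a process currently holds (Linux /proc interface).
# $$ (the current shell) is used here as a stand-in; replace it with the PID
# of the MVE process you want to inspect.
PID=$$
ls /proc/"$PID"/fd | wc -l

# Listing the symlinks shows what each descriptor points at, which would
# reveal whether .mve view files are piling up
ls -l /proc/"$PID"/fd
```

If the count climbs steadily toward the ulimit value as views are processed, that confirms the views are not being closed after use.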
Hi Jus80687,
we've tried to reproduce the bug with the patch applied but were unsuccessful. Are you sure you have compiled an up-to-date version of the code?
Also, after how many files does the error happen with the patch applied? Can you post a backtrace when it happens?
Can you send us the contents of one of the view_xxxx.mve directories of your dataset please?
Thanks in advance.
Hi Andre,
Yes, I compiled the latest version of MVE. With the patch applied, the error no longer happens at exactly the same view, but still around 1024. The workaround with ulimit of course still works.
I don't think it is related to a specific view, since I changed the number of pictures a few times. The pictures themselves are extracted from a movie and have no EXIF information.
The console output has not changed, but I will have to check the backtrace again.
Thanks for your help
At this point we need additional data to reproduce the problem. Can you please send us a ZIP of one of your views? Thanks.
Any additional information? Otherwise the issue will be closed.