I'm facing the same issue on Windows :/
I'll search to see if I can find where this comes from
If I remember correctly, you use Windows, right ?
If so, have you tried on Linux ? Maybe it's OS-related ?
I'll try on Windows
Hi !
Just tried your fix with the new 2.6.1 !
But I'm still facing the issue :(
If I remove authentication it works, but if I leave it enabled it crashes during import
Thanks a lot for your help ! :)
I tried disabling authentication and it worked great. Still a bit slow but not crashing which is great ! :)
Waiting for the new release :)
Hi :)
It's been a while but we're still facing the problem.
I just sent you a zip file by email to show you that it breaks even with ~500 KB of data
Ok so I just tried again on my computer.
The same issue. I really think that it's just an import performance issue.
I'm not allowed to send the import package as it contains a lot of sensitive data.
Though I tried deleting all the sensitive data and importing the file, and it works !!! But yes, the import package is only ~100 KB now so...
Haven't found any workaround for now... Still the same issue, but now I'm sure it's not related to NFS.
Actually, when I don't persist the data on the NFS server (no volume on the Docker container), it takes about a minute to import the data.
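Roughly, the two setups I'm comparing look like this (just a sketch: the image name and the host NFS path below are placeholders, only /app/data matches our container):
# No persistence: /app/data stays inside the container (import takes about a minute)
docker run -d --name app-no-nfs my-app-image
# Persisted on NFS: /app/data mounted from an NFS-backed host path (the problematic case)
docker run -d --name app-nfs -v /mnt/nfs/app-data:/app/data my-app-image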
I benchmarked communications between the container and the NFS server:
root@ac2bf523cd00:/app# time dd if=/dev/zero of=data/testfile bs=16k count=128k
131072+0 records in
131072+0 records out
2147483648 bytes (2.1 GB, 2.0 GiB) copied, 18.434 s, 116 MB/s
real 0m19.292s
user 0m0.029s
sys 0m2.351s
I don't have exact numbers for editing big entities, but it's still pretty fast.
Also, I just tried to import a 5.4 MB file and I'm still getting the same issue
I'll give MongoDB a shot
Hi Jan,
Thanks, I just tried it and the same thing is happening
I forgot to mention that some data does get added, but I don't know whether all of it is there :/