Hm. We will need to try to isolate the source of the problem. We don't know whether it is slow just because of the network fs, whether it is a general problem with import, or whether it is something else entirely.
What if you import a smaller export, does it work?
How long does it take to edit a bigger entity in the studio?
What if you configure the template store to connect to mongodb, does it import the whole package?
Actually, when I don't persist data on the NFS server (no volume on the docker container), it takes a minute to import the data.
I benchmarked communication between the container and the NFS server:

```
root@ac2bf523cd00:/app# time dd if=/dev/zero of=data/testfile bs=16k count=128k
131072+0 records in
131072+0 records out
2147483648 bytes (2.1 GB, 2.0 GiB) copied, 18.434 s, 116 MB/s

real    0m19.292s
user    0m0.029s
sys     0m2.351s
```
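One thing worth noting: the dd test above measures sequential throughput, while an import typically writes many small entities, where NFS round-trip latency dominates. A hypothetical follow-up benchmark (the `data/` path stands in for the NFS mount) would be timing lots of tiny writes:

```shell
# Time 1000 tiny 4 KB writes instead of one big sequential write.
# On a high-latency NFS mount this can be dramatically slower than
# the throughput number suggests.
mkdir -p data
time sh -c 'for i in $(seq 1 1000); do
  dd if=/dev/zero of="data/small_$i" bs=4k count=1 2>/dev/null
done'
rm -f data/small_*
```

If this loop takes far longer per megabyte than the big dd copy, per-file latency rather than raw bandwidth is the bottleneck.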
I don't have exact data on editing big entities, but it's pretty fast.
Also, I just tried to import a 5.4 MB file and still hit the same issue.
I'll give MongoDB a shot.
Well, here is the import with MongoDB 3.6
Not sure what the mentioned errors are. Would you be able to share the import package?
Ok so I just tried again on my computer.
The same issue. I really think that it's just an import performance issue.
I'm not allowed to send the import package as it contains a lot of sensitive data.
Though I tried deleting all the sensitive data and importing the file, and it works!!! But then the import package is only 100 KB, so...
I haven't found any workaround for now... Still the same issue, but now I'm sure it's not related to NFS.
It seems to be related to your particular export package if you say it doesn't work on your local PC as well.
However, I've tried a 10 MB export package and it imports in 1s.
I believe it is not related to the actual content of your templates and data. You should be able to replace characters in your templates/data with some 'a' values and share the export with me.
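As a quick way to do that scrubbing, a hypothetical one-liner on the extracted files (not the zip itself; `template.html` is just a stand-in name) could replace every letter and digit with 'a', so the file keeps its exact size and structure while leaking nothing:

```shell
# Create a sample file standing in for one extracted from the export,
# then replace all alphanumerics with 'a'. Punctuation and whitespace
# are kept, so the structure and size are unchanged.
printf 'customer: Acme Corp, id: 4217\n' > template.html
sed 's/[A-Za-z0-9]/a/g' template.html > template.anonymized.html
cat template.anonymized.html   # aaaaaaaa: aaaa aaaa, aa: aaaa
```

Repacking the scrubbed files into a new export should reproduce the size-related behavior without exposing the original content.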
It's been a while but we're still facing the problem.
I just sent you a zip file by email to show you that even with ~500 KB of data it breaks.
The performance problem was solved here.
It will be part of the next release.
The current workaround is to temporarily disable authentication before import in case it is timing out.
Thanks a lot for your help! :)
I tried disabling authentication and it worked great. Still a bit slow, but not crashing, which is great! :)
Waiting for the new release :)
Just tried your fix with the new 2.6.1!
But I'm still facing the issue :(
If I remove authentication it works, but if I leave it enabled it crashes during import.
Hm. Also on your local?
It finishes for me within 3s and with disabled auth in 2s.
Same on my local :/
If I remember correctly, you use Windows, right?
If so, have you tried on Linux? Maybe it's OS-related?
I'll try on Windows
We may try to add some more optimizations later.
However we see no issues on various OSX, Ubuntu, Windows machines.
I'm facing the same issue on Windows :/
I'll dig in to see if I can figure out where this comes from.