I'm facing the same issue on Windows :/
I'll search to see if I can find where this comes from.
If I remember correctly, you use Windows, right?
If so, have you tried on Linux? Maybe it's OS related?
I'll try on Windows
Hi!
Just tried your fix with the new 2.6.1!
But I'm still facing the issue :(
If I remove authentication it works, but if I leave it enabled it crashes during import.
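In case it helps with reproducing: by "remove authentication" I mean dropping the authentication block from my jsreport.config.json. It looks roughly like this (a sketch from memory, so double-check the key names against the jsreport authentication docs):

{
  "extensions": {
    "authentication": {
      "cookieSession": { "secret": "<some secret>" },
      "admin": { "username": "admin", "password": "<password>" },
      "enabled": true
    }
  }
}

With this block present the import crashes; with it removed (or, I believe, with "enabled": false) the import goes through.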
Thanks a lot for your help! :)
I tried disabling authentication and it worked great. Still a bit slow, but not crashing, which is great! :)
Waiting for the new release :)
Hi :)
It's been a while but we're still facing the problem.
I just sent you a zip file by email to show you that it breaks even with ~500 KB of data.
OK, so I just tried again on my computer.
Same issue. I really think it's just an import performance issue.
I'm not allowed to send the import package as it contains a lot of sensitive data.
Though I did try deleting all the sensitive data and importing the file, and it works!!! But then the import package is only ~100 KB, so...
Haven't found any workaround so far. Still the same issue, but now I'm sure it's not related to NFS.
Actually, when I don't persist data on the NFS server (no volume on the Docker container) it takes about a minute to import the data.
I benchmarked communication between the container and the NFS server:
root@ac2bf523cd00:/app# time dd if=/dev/zero of=data/testfile bs=16k count=128k
131072+0 records in
131072+0 records out
2147483648 bytes (2.1 GB, 2.0 GiB) copied, 18.434 s, 116 MB/s
real 0m19.292s
user 0m0.029s
sys 0m2.351s
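That's sequential throughput, though. As far as I understand, the fs store persists entities as lots of small files on disk, so a small-file test is probably closer to what an import actually does over NFS. Something along these lines (my own rough sketch; data/ is just where my store volume is mounted):

# write 1000 tiny files with fsync to mimic many small entity writes
time sh -c 'for i in $(seq 1 1000); do dd if=/dev/zero of=data/smalltest_$i bs=4k count=1 conv=fsync 2>/dev/null; done'
rm -f data/smalltest_*

If that takes dramatically longer on the NFS-backed volume than on local disk, it would support the "many small writes" theory.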
I don't have exact numbers for editing big entities, but it's pretty fast.
Also just tried to import a 5.4 MB file and still the same issue.
I'll give MongoDB a shot.
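If it helps anyone, I believe switching the store is just a config change, roughly like this (key names are from memory, so please check the jsreport-mongodb-store docs before copying; the host and database names below are placeholders):

{
  "store": { "provider": "mongodb" },
  "extensions": {
    "mongodb-store": {
      "address": "my-mongo-host",
      "databaseName": "jsreport"
    }
  }
}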
Hi Jan,
Thanks, just tried and the same thing is happening.
I forgot to mention that some data does get added, but I don't know if all of it is there :/
This doesn't happen on my local PC; it's specific to my staging and production environments.
They are using Docker (with Rancher 1.6) with an NFS server.
Connectivity between the servers and NFS is OK since all the other services are working fine.
Also, when I use MongoDB (which is also persisted on NFS) it doesn't happen.
Hi,
I'm using JSReport enterprise in a Docker environment.
I'm using the default values for storage (fs).
I made an export in the UI and got a zip file of 10.8 MB.
At this point, everything is OK.
Now, when I try to import it again, I get a timeout after 2 minutes. The connection simply drops:
Request has been terminated
Possible causes: the network is offline, Origin is not allowed by Access-Control-Allow-Origin, the page is being unloaded, etc. crossDomainError@http://xxx.xxx.xxx.xxx:5488/studio/assets/client.js?e33dc3808fd7cf980ebb:32:7719
onreadystatechange@http://xxx.xxx.xxx.xxx:5488/studio/assets/client.js?e33dc3808fd7cf980ebb:32:8506
Two minutes seems like a lot given the amount of data.
I'm using JSReport 2.4.0-full, also tried 2.5.0-full with no luck.
Here is my config file content:

{
  "httpPort": 5488,
  "allowLocalFilesAccess": true,
  "store": { "provider": "fs" },
  "blobStorage": { "provider": "fs" },
  "templatingEngines": { "timeout": 10000 },
  "chrome": { "timeout": 30000 },
  "extensions": { "scripts": { "timeout": 30000 } }
}
Can you give me some advice?