jsreport-aws-s3-storage Report: control file name and directory



  • Is there a way to control the file name and the directory each report goes to on S3?

    Ideally, the report name should append the date or a number, e.g. "dailybalance20210203.xls".

    Additionally, being able to control the subfolders where the files go on S3 would be great for permission reasons.

    "options": {
            "reports": {
                "save": true,
                "reportName": "dailybalance",
                "blobName": "dailybalancexls"
            }
        },
    

    Any help here would be amazing.



  • You can use options.reports.blobName. Do you have issues with it?
    I haven't tested folders, but can't you put something like myFolder/myblobname into the blobName?
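
    As an untested sketch, the folder prefix would simply become part of the blob name in the request options (the folder and file names below are only illustrative, reusing the names from this post):

    "options": {
        "reports": {
            "save": true,
            "blobName": "myFolder/myblobname"
        }
    }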



  • For an individual report, where is the best place to put this? Also, would the name still get the encoded string appended to it as it does now, e.g. "dailybalance162769128763.pdf"?

    Will it save to S3 as a unique file or overwrite it?

    Thanks



  • For an individual report, where is the best place to put this? Also, would the name still get the encoded string appended to it as it does now, e.g. "dailybalance162769128763.pdf"?

    I'm afraid I don't understand the question. Could you elaborate?

    Will it save to S3 as a unique file or overwrite it?

    The jsreport AWS S3 storage extension calls the AWS S3 upload method, and I believe it overwrites the file when it already exists.



  • Sorry Jan,

    Is there a way to upload the file and have, say, the date appended to the file name?

    dailyblancerpt20210303.pdf
    dailyblancerpt20210304.pdf

    Where do I put the options.reports.blobName option for a single report?



  • You have two options.

    1. Fully declare your desired blob name in the API request:
    {
       "template": {
         "content": "Hello",
         "recipe": "html",
         "engine": "none"    
       },
      "options": {  
        "reports": {
          "save": true,
          "blobName": "dailyblancerpt20210303"
        }
      }    
    }
    

    This means your client code will be responsible for specifying the blob name (see the client-side sketch after these two options).

    2. Implement a custom jsreport script and add the logic for computing the blob name there:
      https://jsreport.net/learn/scripts
    function afterRender(req, res) {   
        if (res.meta.reportsOptions) {
            res.meta.reportsOptions.blobName = "myreport" + new Date().getTime()       
        }
    }
    

    Note that the second option isn't documented yet and its naming may change in future versions, but it will still be possible to implement such a script.
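
    To illustrate the first option, here is a minimal client-side sketch (assuming a Node.js client using axios; the URL is the jsreport default dev address and the template name is only an example) that computes a date-stamped blob name and sends it with the render request:

    const axios = require('axios')

    async function renderDailyReport() {
        // build a stamp like "20210303" from today's date
        const d = new Date()
        const stamp = `${d.getFullYear()}${String(d.getMonth() + 1).padStart(2, '0')}${String(d.getDate()).padStart(2, '0')}`

        // POST to the jsreport rendering API (http://localhost:5488 is the default dev URL)
        const response = await axios.post('http://localhost:5488/api/report', {
            template: { name: 'dailybalance' },   // example template name
            options: {
                reports: {
                    save: true,
                    blobName: 'dailyblancerpt' + stamp   // e.g. dailyblancerpt20210303
                }
            }
        }, { responseType: 'arraybuffer' })

        return response.data // the rendered report bytes
    }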



  • In the case of a CSV report

    Created report

    PunterId,PaymentID,UCTRegDate,PSTRegDate,Method_Type,Status,DepOrWD,Amount
    {{for all_funding_trans}}
    {{:PunterId}},{{:PaymentID}},{{:UCTRegDate}},{{:PSTRegDate}},{{:Method_Type}},{{:Status}},{{:DepOrWD}},{{:Amount}}
    {{/for}}
    
    

    With this script:

    function afterRender(req, res) {   
        if (res.meta.reportsOptions) {
            res.meta.reportsOptions.blobName = "csv_all_funding_trans" + new Date().getTime()       
        }
    }
    

    I do have PDF files that run on a schedule and go to the root of the S3 bucket set in the config file, so I know it is working.

    I want these CSV reports to go to the subfolder S3Bucket_Default/dailycsv/"filenameabove".

    How can I get CSV or XLS files to run like this?



  • @jan_blaha Is there a script, similar to the email scripts, for sending files to S3 on a schedule? I have not seen an example.

    While the report scheduler saves to the default S3 bucket, it would be a good enhancement to be able to select the destination folder and provide a filename-with-date variable in the scheduler; that would resolve the issue, I think.



  • I want these CSV reports to go to the subfolder S3Bucket_Default/dailycsv/"filenameabove".
    How can I get CSV or XLS files to run like this?

    I believe you can just use slashes in the blob name to store into S3 folders:

    function afterRender(req, res) {   
        if (res.meta.reportsOptions) {
            res.meta.reportsOptions.blobName = "/dailycsv/csv_all_funding_trans" + new Date().getTime()       
        }
    }
    

    Is there a script, similar to the email scripts, for sending files to S3 on a schedule? I have not seen an example.

    I'm not sure I understand the problem. You can use the same script as posted above.



  • @jan_blaha

    This is the report with the recommended script:
    [screenshot]

    When you run the report, it does not put the file in S3.

    However, when you run the report on the scheduler, it puts the file with its own name at the S3 root:
    [screenshot]

    I hope that clears up the question.



  • I see you put the afterRender function code in the template's helpers section.
    It should go into a separate script entity that is associated with the template.
    Please go through the documentation here
    https://jsreport.net/learn/scripts



  • @jan_blaha Thanks for the help so far.

    Still battling S3 storage.

    Log from the console:

    2021-04-27T17:34:02.927Z - warn: Error during processing request at https://reports.luckiihhr.com:5489/api/report/csv_all_funding_trans
    2021-04-27T17:34:51.284Z - info: Starting rendering request 14 (user: admin)
    2021-04-27T17:34:51.284Z - info: Rendering template { name: csv_all_funding_trans, recipe: text, engine: jsrender, preview: true }
    2021-04-27T17:34:51.285Z - debug: Data item not defined for this template.
    2021-04-27T17:34:51.285Z - debug: Resources not defined for this template.
    2021-04-27T17:34:51.285Z - debug: Executing script dbo_all_funding_transactions using http-server strategy
    2021-04-27T17:34:51.954Z - debug: Executing script S3_csv_all_funding_trans using http-server strategy
    2021-04-27T17:34:51.973Z - debug: Base url not specified, skipping its injection.
    2021-04-27T17:34:51.974Z - debug: Rendering engine jsrender using http-server strategy
    2021-04-27T17:34:52.029Z - debug: Compiled template not found in the cache, compiling
    2021-04-27T17:34:52.031Z - debug: Executing recipe text
    2021-04-27T17:34:52.031Z - debug: Executing script anonymous using http-server strategy
    2021-04-27T17:34:52.093Z - info: Rendering request 14 finished in 809 ms
    2021-04-27T17:34:52.093Z - debug: Skipping storing report. 
    

    Report:

    PunterId,PaymentID,UCTRegDate,PSTRegDate,Method_Type,Status,DepOrWD,Amount
    {{for all_funding_trans}}
    {{:PunterId}},{{:PaymentID}},{{:UCTRegDate}},{{:PSTRegDate}},{{:Method_Type}},{{:Status}},{{:DepOrWD}},{{:Amount}}
    {{/for}}
    

    This calls the data script and this file-store script:

    function afterRender(req, res) {
        if (res.meta.reportsOptions) {
            res.meta.reportsOptions.blobName = '/optimove/csv_all_funding_trans.csv' + new Date().getTime()
        }
    }
    


  • This is a request from the studio, which doesn't persist a report.
    Please try rendering through an API call, using something like Postman,
    and include the options for report persisting:

    "template": { "name": "..." },
    "options": {  
        "reports": {
          "save": true    
        }
      }   
    


  • @jan_blaha I thought I would be clever and add this to the email script, but I get:

    Error while executing user script. Unexpected token ':'
    evaluate-user-script.js:7
      "template": { "name": csv_all_funding_trans}
                ^
    

    The script looks like this:

    const axios = require('axios')
    
    async  function beforeRender(req, res)  {
      const resData =  await axios.get('http://jsonplaceholder.typicode.com/posts')
      console.log(resData.data)
      req.data.posts = resData.data
      "template": { "name": csv_all_funding_trans}
      "options": {  
        "reports": {
          "save": true
          "blobName": "/optimove/csv_all_funding_trans.csv"    
        }
      } 
    }
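
    For reference, here is a minimal sketch of that script with the options set as JavaScript assignments instead of raw JSON. Whether the reports extension honours options set from inside a script is an assumption here; if it doesn't, pass them in the API request body as shown above. The blob name reuses the folder and file name from this thread.

    const axios = require('axios')

    async function beforeRender(req, res) {
        const resData = await axios.get('http://jsonplaceholder.typicode.com/posts')
        req.data.posts = resData.data

        // assumption: the reports extension also picks up options set from a script;
        // otherwise these belong in the API request body
        req.options.reports = Object.assign({}, req.options.reports, { save: true })
    }

    function afterRender(req, res) {
        // blob name computed here, as recommended earlier in the thread
        if (res.meta.reportsOptions) {
            res.meta.reportsOptions.blobName = 'optimove/csv_all_funding_trans' + new Date().getTime()
        }
    }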
