Forum Discussion

darshthakkar's avatar
darshthakkar
Valued Contributor
4 years ago
Solved

Transformation rules inside mapper snap not exported to output file

Hi Team,

I have found that one of the transformation rules applied to a date field is not carried through to the final output file (i.e. an Excel file) when the pipeline is executed.
The output preview on the snap shows the right (i.e. expected) data, and pipeline validation does too, however the file generated after pipeline execution doesn’t contain those changes.

I have tried the following:

  • Validated the pipeline via Shift+Validate
  • Tweaked the transformation rule, saved the pipeline, and checked the output in the preview
  • Validated the preview file (with the default 50 records)
  • Deleted the file generated by “Validate & Execute” on the File Writer (default 50 records) so that a fresh instance of the data comes in
  • Executed the pipeline multiple times with different file names

It’s a simple pipeline consisting of

            Snowflake Execute --> Mapper --> Excel Formatter --> File Writer

Sharing some screenshots to help explain this better:
(1) Pipeline with 4 snaps

(2) Snowflake Execute output preview in JSON format (highlighted yellow fields need transformation)

(3) Transformation Rules on Date fields

(4) Transformation Rule: $Expire_Date.toString().replace('{Timestamp=',"").replace('.000+0000}',"Z")

(5) Output preview after transformational rule

(6) Mapper output preview shows different data

→ I’ve observed at times that the transformation rules won’t work, so I go inside the Mapper, modify the rule a bit, then revert it to the original rule so that I can save it (Save is not enabled if there is no change, hence this workaround)

(7) As Shift+Validate didn’t work, I had to change the transformation rule and save it; this is the resulting output preview on the Mapper snap:

(8) Settings under excel formatter snap for reference:

(9) Settings under File Writer snap for reference:

(10) File generated with default 50 records due to validate & execute on File writer snap

(11) Downloaded the file that gets generated in step 10 and below is the output (expected)

(12) Executed Pipeline

(13) Output File of Executed pipeline



I’ve seen this happen numerous times in other pipelines too, particularly with fields containing timestamps. Am I doing something wrong here? Are there any settings that need to be changed in the Mapper, Excel Formatter, or File Writer snap?

Thanks in advance for your valuable time and suggestions on this one.

Best Regards,
DT


23 Replies

  • Hi @darshthakkar,

    Since the field you are transforming is a timestamp (datetime), it is best practice to format the date with the built-in method .toLocaleDateTimeString({format: "yyyy-MM-dd'T'HH:mm:ss'Z'"}) instead of converting it to a string and then manually replacing the timezone offset.

    • darshthakkar's avatar
      darshthakkar
      Valued Contributor

      Thank you @j.angelevski for the direction.
      I will try your suggestion and keep you posted on how it goes, appreciate your help though!

      The rationale behind converting it to a string was: I wanted to export the raw data but it wasn’t allowed; the error was “Flatten the file before export”, thus .toString() was used. I wasn’t aware of the built-in method, so THANK YOU for sharing that.

  • robin's avatar
    robin
    Former Employee

    @darshthakkar try

    Date.parse($Expire_Date.Timestamp).toLocaleDateTimeString({format: "yyyy-MM-dd'T'HH:mm:ss'Z'"})
    

    I believe that will do what you want

    • darshthakkar's avatar
      darshthakkar
      Valued Contributor

      This worked! Thank you, Robin.

      However, the exported file has no values under dates, screenshot below for your convenience:

      Just for my understanding, why would $Expire_Date not give us the same results? Is it because it’s an object, whereas $Expire_Date.Timestamp is a string? (Q-1)

      Moreover, $Expire_Date.toString().replace('{Timestamp=',"").replace('.000+0000}',"Z") did give the expected results in the snap’s output preview and after validating the pipeline, yet the transformation didn’t work when the pipeline was executed. What would be the rationale behind that? (Q-2)

      I want to understand this in detail so that I can learn SnapLogic better and become a good SnapLogic developer. Please don’t get me wrong.

      Best Regards,
      Darsh

      • darshthakkar's avatar
        darshthakkar
        Valued Contributor

        In addition to this,

        The output preview differs between the field and the Mapper snap; screenshots below:

        Isn’t it strange that the same snap which shows the expected output in the preview of one of the fields shows null when doing a sanity check on the entire snap?

  • The Snowflake snaps have a setting called Handle Timestamp and Date Time Data. Unfortunately, its default setting, Default Date Time format in UTC Time Zone, has less-than-ideal behavior. The object type of the timestamp objects is java.sql.Timestamp, which results in the odd inconsistencies you’re seeing between validation and execution, and it’s also not a very usable type in the SnapLogic expression language.

    I suggest changing this setting to the other option, SnapLogic Date Time format in Regional Time Zone. This will convert the timestamps to the type org.joda.time.DateTime, the type that the SnapLogic EL functions are designed to deal with.

    This option will also produce a DateTime with the default time zone of the plex node, which can complicate things if the node’s default timezone isn’t UTC. But based on what you’ve shown, your node’s default timezone is UTC, so this shouldn’t be a problem for you, fortunately.

    After changing that setting, try using an expression like $Expire_Date.toLocaleDateTimeString({format: "yyyy-MM-dd'T'HH:mm:ss'Z'"}).

    Hope that helps.

    • darshthakkar's avatar
      darshthakkar
      Valued Contributor

      Hi @ptaylor,

      Really appreciate you helping me on this one; however, the values are coming out as null with the solution provided. Screenshots below for your reference and convenience:

      (1) Snowflake Execute snap settings changed:

      (2) Values coming as Null during output preview

      (3) Output preview of the mapper snap in JSON format

      (4) Validation file generated with default 50 records:

      (5) File generated after executing the pipeline



      Though there is one positive from this solution: my original transformation rule is working fine. I’m not sure if it’s intermittent, as I’ve observed that rule working fine at some instances and then, after a while, going berserk.

      Let me know your thoughts on this.

      Best Regards,
      DT

      • ptaylor's avatar
        ptaylor
        Employee

        It looks like you’re missing a $ at the beginning of the expression. Please check.

  • darshthakkar's avatar
    darshthakkar
    Valued Contributor

    With the above approach, the result in the Date field comes out as null.

    Screenshots below for reference:

  • darshthakkar's avatar
    darshthakkar
    Valued Contributor

    Providing a summary of the transformation rules applied so far:

    1. $Expire_Date.toString().replace('{Timestamp=',"").replace('.000+0000}',"Z") → Output Preview [works expected] → Exported Excel File [transformation rules don’t appear to work]

    2. $Expire_Date.toLocaleDateTimeString({format: "yyyy-MM-dd'T'HH:mm:ss'Z'"}) → Output Preview [shows Null values] → Exported Excel File [all blank values]

    3. Date.parse($Expire_Date.Timestamp).toLocaleDateTimeString({format: "yyyy-MM-dd'T'HH:mm:ss'Z'"}) → Output preview [works expected] → Exported Excel File [all blank values]

    • darshthakkar's avatar
      darshthakkar
      Valued Contributor

    I’ve tested the above transformation rules with a CSV file and the results are the same as discussed above.

      The output preview file generated in “Manager” (when the File Writer snap has Validate & Execute enabled) shows the expected data with both the Excel and CSV versions of the file, but the file downloaded after executing the pipeline is blank. It blows me away why it would behave like this.

      • robin's avatar
        robin
        Former Employee

        That’s very odd indeed. Let me look into it a bit to see what is happening.

  • darshthakkar's avatar
    darshthakkar
    Valued Contributor

    Summarizing the solution as per this use case:

    1. Snowflake execute snap Settings > Handle Timestamp and Date Time Data > SnapLogic Date Time format in Regional Time Zone
    2. Use either of the below expressions in mapper snap:
      a) $Expire_Date.toLocaleDateTimeString({format: "yyyy-MM-dd'T'HH:mm:ss'Z'"})
      b) $Expire_Date.toString().replace('.000Z',"Z")

    This works only for this use case; before using these expressions, check the raw data and then write the transformation rule according to the raw data and the expected output.

    Signing off from this thread. Thanks again everyone (@j.angelevski, @robin and @ptaylor) for your help.