Forum Discussion
Hi Roger667
I'm not sure about this behavior; I haven't faced it before and am not able to understand the issue. Maybe you can try copying the contents of the file into a new CSV and using that.
A mapping CSV file is one option for converting the metadata; you can also use an expression file to achieve this. I found the mapping CSV easier, so that's what I used, but you can also maintain these details in an expression file and call it from a mapper to get the appropriate Parquet data types.
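For illustration, a mapping sheet of this kind typically pairs each source field's type with a target Parquet type. The exact column names and layout depend on the tool, so the fields below are assumptions, not the actual format:

```
source_field,source_type,target_type
CreatedDate,datetime,bigint
Amount,currency,double
Name,string,string
```

The mapper (or expression file) would then look up each field in this sheet to decide the output type when writing Parquet.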
Thanks !
Hi manichandana_ch
This error is coming from a datetime data type in the Salesforce object that is mapped as bigint in the mapping sheet. But this mapping leads to loss of data, and the timestamp only comes through as the year. Can you suggest how to handle this datetime?
- manichandana_ch · 3 years ago · New Contributor III
Hi Roger667 ,
The timestamp is converted to bigint because the Parquet writer didn't work very well when we tried writing timestamps directly, so we convert it to bigint, which stores the value in Unix time format. We then cast the timestamp columns back when we (or downstream consumers) read the file. If bigint doesn't work for you, you can use string format instead and cast it while reading the file.
Thanks !
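The round trip described above can be sketched with pandas. This is a minimal illustration, not the poster's actual pipeline; the column name and sample values are hypothetical:

```python
import pandas as pd

# Hypothetical example: a Salesforce-style datetime column.
df = pd.DataFrame(
    {"CreatedDate": pd.to_datetime(["2021-03-15 10:30:00", "2021-06-01 08:00:00"])}
)

# Write side: convert the timestamp to Unix epoch milliseconds (int64/bigint)
# before handing the frame to the Parquet writer.
df["CreatedDate"] = df["CreatedDate"].astype("int64") // 10**6  # ns -> ms

# ... write df to Parquet and read it back here ...

# Read side: cast the bigint column back to a proper timestamp.
df["CreatedDate"] = pd.to_datetime(df["CreatedDate"], unit="ms")
```

Casting back with the same unit (milliseconds here) recovers the original timestamp with no loss; losing everything but the year usually means the wrong unit or a truncating cast was applied on the write side.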