10-26-2021 02:39 PM
Hi All,
I am trying to load a new Redshift table which I have created. During validation, I am getting:
"error_code": 1207
"error_message": "Invalid digit, Value '.', Pos 9, Type: Integer"
"column_value": "595546349.04"
I have tried altering the column data type of this Redshift table to DOUBLE PRECISION and REAL, but I still get the above error. I also applied parseFloat() to this field before writing to Redshift; that did not work either, same error. Does anyone know how to address this? Please let me know.
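For context, this is roughly the shape of the table I created (names simplified for illustration; the failing field was originally an integer type):

-- Rough sketch; table and column names are simplified and may
-- differ from the real table.
CREATE TABLE t1207 (id INTEGER, annual_revenue INTEGER);
-- Loading the text value '595546349.04' into the INTEGER column
-- fails at parse time: the integer parser hits the '.' at
-- position 9 and raises error 1207.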
Regards,
Amar
10-27-2021 03:25 PM
Here is some SQL to emit the schema for a table. I'm sure there are other ways in Redshift.
SELECT "tablename", "column", "type"
FROM PG_TABLE_DEF
WHERE schemaname = 'public'
AND tablename = 't1207'
->
TABLENAME  COLUMN      TYPE
t1207      id          integer
t1207      bigdecimal  double precision
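One thing to note: Redshift can't change a column's type in place (ALTER COLUMN ... TYPE is essentially limited to widening VARCHARs), which may be why the ALTER attempts didn't help. A sketch of the usual add/backfill/swap workaround, using the names from the example above:

-- Add/backfill/swap pattern; names are from the example above
-- and may differ from your real table.
ALTER TABLE t1207 ADD COLUMN bigdecimal_new DOUBLE PRECISION;
UPDATE t1207 SET bigdecimal_new = bigdecimal;
ALTER TABLE t1207 DROP COLUMN bigdecimal;
ALTER TABLE t1207 RENAME COLUMN bigdecimal_new TO bigdecimal;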
10-27-2021 03:26 PM
From what I can see, the source is a Google Sheet and the value shows up like this. Let me try with BIGINT and get back to you soon.
10-27-2021 03:30 PM
Is annual_revenue an integer or a decimal? BIGINT will likely raise the same 1207 error when trying to write a decimal value. If it should be an integer, then we should modify the pipeline to write this value as an integer, as sketched below.
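For example, a minimal sketch of handling it in SQL terms, using the failing value from the error message:

-- Round the decimal value before casting to an integer type.
SELECT ROUND(595546349.04)::BIGINT;  -- 595546349
-- Use TRUNC(...) instead of ROUND(...) to drop the fraction
-- rather than round it.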
10-27-2021 04:49 PM
Hi Matthew,
I checked the range of values possible for this field; a representative sample is below.
6,515,871,900,000
6,515,871,900,000
6,515,871,900,000
6,515,871,900,000
6,515,871,900,000
6,515,871,900,000
256,800,000,000
256,800,000,000
256,800,000,000
87,351,502,445.84
76,476,000,000
64,290,705,800.14
0.00
0.00
Null
Null
Do you have any idea which data type will suit this field when creating the table? Please help.
Regards,
Amar.
10-27-2021 04:56 PM
DOUBLE PRECISION if we care about decimals. I see only two decimal values (.84 and .14). If those can be stored as integers, then BIGINT. If we go with BIGINT, we need to handle the decimal values in the pipeline by either rounding or truncating them.
Oh, I also notice 0.00, and that .00 may be significant. DOUBLE PRECISION would be able to store all of these values.
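Concretely, a sketch of the two options (table names here are made up for comparison; the column name is taken from earlier in the thread):

-- Option 1: preserve the decimal values.
CREATE TABLE t1207_double (id INTEGER, annual_revenue DOUBLE PRECISION);
-- Option 2: integers only; the pipeline must ROUND or TRUNC
-- values such as 87351502445.84 before writing.
CREATE TABLE t1207_bigint (id INTEGER, annual_revenue BIGINT);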