I'm using SQLAlchemy backed by psycopg2 to interact with data in AWS Redshift. I'm trying to insert data and am getting a perplexing error about text vs. varchar types. Here is a source snippet:
from sqlalchemy import Table, Column, MetaData, String, cast

metadata = MetaData()
conn = get_db_connection()  # returns a SQLAlchemy connection to our Redshift cluster

keys = ["a", "b"]
values = ["aa", "bb"]

tbl_qualtrics_dim = Table('qualtrics_dim', metadata,
    Column('k', String(1024), primary_key=True),
    Column('v', String(1024), nullable=False)
)
metadata.create_all(conn, checkfirst=True)

ins = tbl_qualtrics_dim.insert().values(k=keys, v=values)
conn.execute(ins)
This creates the table just fine, with two varchar(1024) columns (Redshift doesn't support clob types). The last line, however, throws an error when inserting the data:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) column "k" is of type character varying but expression is of type text[]
HINT: You will need to rewrite or cast the expression.
[SQL: 'INSERT INTO qualtrics_dim (k, v) VALUES (%(k)s, %(v)s)'] [parameters: {'v': ['aa', 'bb'], 'k': ['a', 'b']}]
A cast, as I understand it, doesn't help; e.g. if I change that line to:
ins = tbl_qualtrics_dim.insert().values(k=cast(keys, String(1024)), v=values)
And I get a casting error:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) cannot cast type text[] to character varying
[SQL: 'INSERT INTO qualtrics_dim (k, v) VALUES (CAST(%(param_1)s AS VARCHAR(1024)), %(v)s)'] [parameters: {'v': ['aa', 'bb'], 'param_1': ['a', 'b']}]
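In case it helps narrow things down, my guess is that psycopg2 is adapting the Python lists to Postgres arrays, hence the text[]. The next variant I'd try is executemany-style binding: passing a list of per-row dicts to execute(), so each bound parameter is a scalar string rather than a list. A minimal self-contained sketch of that form (using in-memory SQLite in place of Redshift, since I can't share our cluster):

```python
from sqlalchemy import create_engine, Table, Column, MetaData, String

engine = create_engine("sqlite:///:memory:")  # stand-in for the Redshift connection
metadata = MetaData()

tbl = Table('qualtrics_dim', metadata,
    Column('k', String(1024), primary_key=True),
    Column('v', String(1024), nullable=False)
)
metadata.create_all(engine)

keys = ["a", "b"]
values = ["aa", "bb"]

with engine.begin() as conn:
    # A list of per-row dicts triggers executemany, so each parameter
    # binds as a scalar string instead of a Python list.
    conn.execute(tbl.insert(), [{"k": k, "v": v} for k, v in zip(keys, values)])
    rows = conn.execute(tbl.select().order_by(tbl.c.k)).fetchall()

print(rows)
```

Whether that sidesteps the array adaptation under psycopg2 against Redshift, I don't know.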
Any hints on what might be going wrong here?
Regards,
Chris