
pandas - How to include SQLAlchemy Data types in Python dictionary

I've written an application using Python 3.6, pandas and SQLAlchemy to automate the bulk loading of data into various back-end databases, and the script works well.

As a brief summary, the script reads data from various Excel and CSV source files, one at a time, into a pandas dataframe and then uses the df.to_sql() method to write the data to a database. For maximum flexibility, I use a JSON file to provide all the configuration information, including the names and types of the source files, the database engine connection strings, the column titles in the source file and the column titles in the destination table.

When my script runs, it reads the JSON configuration, imports the specified source data into a dataframe, renames source columns to match the destination columns, drops any columns from the dataframe that are not required and then writes the dataframe contents to the database table using a call similar to:

df.to_sql(strTablename, con=engine, if_exists="append", index=False, chunksize=5000, schema="dbo")
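For illustration, here is a minimal sketch of that kind of config-driven load; the config keys used here ("source_file", "connection_string", "column_map" and "destination_table") are placeholders rather than my actual JSON layout:

import json

import pandas as pd
import sqlalchemy as sa

# Placeholder config keys -- the real JSON layout differs
with open("load_config.json") as f:
    config = json.load(f)

engine = sa.create_engine(config["connection_string"])

df = pd.read_csv(config["source_file"])        # or pd.read_excel() for Excel sources
df = df.rename(columns=config["column_map"])   # source titles -> destination titles
df = df[list(config["column_map"].values())]   # keep only the destination columns

df.to_sql(config["destination_table"], con=engine, if_exists="append",
          index=False, chunksize=5000, schema="dbo")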

The problem I have is that I would also like to specify the column data types in the df.to_sql method and provide them as inputs from the JSON configuration file. However, this doesn't appear to be possible, because all the strings in the JSON file need to be enclosed in quotes and they don't then translate into SQLAlchemy types when read by my code. This is how the df.to_sql call should look:

df.to_sql(strTablename, con=engine, if_exists="append", dtype=dictDatatypes, index=False, chunksize=5000, schema="dbo")

The entries that form the dtype dictionary from my JSON file look like this:

"Data Types": {
                "EmployeeNumber": "sqlalchemy.types.NVARCHAR(length=255)",
                "Services": "sqlalchemy.types.INT()",
                "UploadActivities": "sqlalchemy.types.INT()",      
                 ......

and there are many more, one for each column.

However, when the above is read in as a dictionary, which I pass to the df.to_sql method, it doesn't work, because the SQLAlchemy data types shouldn't be enclosed in quotes, and I can't get around that in my JSON file. The dictionary values therefore aren't recognised by pandas. They look like this:

{'EmployeeNumber': 'sqlalchemy.types.INT()', ....}

And they really need to look like this:

{'EmployeeNumber': sqlalchemy.types.INT(), ....}

Does anyone have experience of this and can suggest how I might be able to specify the SQLAlchemy data types in my configuration file?

question from: https://stackoverflow.com/questions/65844704/how-to-include-sqlalchemy-data-types-in-python-dictionary


1 Answer


You could use eval() to convert the strings into the corresponding SQLAlchemy type objects:

import sqlalchemy as sa
from pprint import pprint

# Type names stored as plain strings, as they would arrive from the JSON file
dict_datatypes = {"EmployeeNumber": "sa.INT", "EmployeeName": "sa.String(50)"}
pprint(dict_datatypes)
"""console output:
{'EmployeeName': 'sa.String(50)', 'EmployeeNumber': 'sa.INT'}
"""

# Evaluate each string so the value becomes an actual SQLAlchemy type
for key in dict_datatypes:
    dict_datatypes[key] = eval(dict_datatypes[key])
pprint(dict_datatypes)
"""console output:
{'EmployeeName': String(length=50),
 'EmployeeNumber': <class 'sqlalchemy.sql.sqltypes.INTEGER'>}
"""

Just be sure that you do not pass untrusted input values to functions like eval() and exec().
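If you would rather avoid eval() altogether, one alternative is to whitelist the type names your configuration file may use and build the types from a small, JSON-friendly spec. A minimal sketch of that idea (the parse_type helper and the "type"/"kwargs" layout are assumptions of this example, not something pandas or SQLAlchemy provide):

import sqlalchemy as sa

# Only type names listed here can be used by the configuration file
ALLOWED_TYPES = {
    "NVARCHAR": sa.types.NVARCHAR,
    "INT": sa.types.INT,
}

def parse_type(spec):
    # spec example: {"type": "NVARCHAR", "kwargs": {"length": 255}}
    return ALLOWED_TYPES[spec["type"]](**spec.get("kwargs", {}))

# As the "Data Types" section might look after json.load()
data_types = {
    "EmployeeNumber": {"type": "NVARCHAR", "kwargs": {"length": 255}},
    "Services": {"type": "INT"},
}
dict_datatypes = {col: parse_type(spec) for col, spec in data_types.items()}
# dict_datatypes can then be passed as the dtype= argument to df.to_sql()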

