Improving SQLAlchemy insert performance for large data sets
Summary: Introduce faster ways of inserting data with SQLAlchemy.
 python  mysql  sqlalchemy
# using add()
for d in data:
    fingerprint0 = tbModel(id=d)
    session.add(fingerprint0)
session.commit()

# using bulk_save_objects
session.bulk_save_objects([tbModel(id=d) for d in data])
session.commit()


# using bulk_insert_mappings
session.bulk_insert_mappings(tbModel, [{"id": d} for d in data])
session.commit()


# using core
session.execute(tbModel.__table__.insert(), [{"id": d} for d in data])
session.commit()

Comparing these four approaches, the speed ranking is:

core > bulk_insert_mappings > bulk_save_objects >>>>> add()

add() is almost 10 times slower than the others.

In short: just don't use add() when inserting a large amount of data.
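To see the difference yourself, the four approaches can be timed against an in-memory SQLite database. This is a minimal sketch, not the original benchmark: the model name `TbModel`, the single `id` column, and the 10,000-row data set are assumptions made for illustration.

```python
import time
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class TbModel(Base):
    __tablename__ = "tb_model"
    id = Column(Integer, primary_key=True)

# in-memory SQLite, so the benchmark needs no external database
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

data = range(10000)

def timed(label, fn):
    """Run fn inside a fresh session, commit, and print the elapsed time."""
    session = Session()
    start = time.perf_counter()
    fn(session)
    session.commit()
    session.close()
    print(f"{label}: {time.perf_counter() - start:.3f}s")
    # clear the table so every approach inserts into an empty table
    with engine.begin() as conn:
        conn.execute(TbModel.__table__.delete())

timed("add()", lambda s: [s.add(TbModel(id=d)) for d in data])
timed("bulk_save_objects", lambda s: s.bulk_save_objects([TbModel(id=d) for d in data]))
timed("bulk_insert_mappings", lambda s: s.bulk_insert_mappings(TbModel, [{"id": d} for d in data]))
timed("core", lambda s: s.execute(TbModel.__table__.insert(), [{"id": d} for d in data]))
```

Exact timings depend on the driver and database, but the ranking above should hold: add() flushes each object through the unit of work, while the Core insert hands the whole parameter list to the driver's executemany in one call.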

Latest Updated Time:2019-05-20 08:17:18