Problem Description
I have a file of about 30000 lines of data that I want to load into a sqlite3 database. Is there a faster way than generating insert statements for each line of data?
The data is space-delimited and maps directly to an sqlite3 table. Is there any sort of bulk insert method for adding volume data to a database?
Has anyone devised some deviously wonderful way of doing this if it's not built in?
I should preface this by asking, is there a C++ way to do it from the API?
Recommended Answer
You can also try tweaking a few parameters to get extra speed out of it. Specifically, you probably want PRAGMA synchronous = OFF;
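For reference, here is a minimal C++ sketch (not from the original answer) of how this can fit together with the sqlite3 C API: PRAGMA synchronous = OFF, a single surrounding transaction, and one prepared statement reused for every row of the space-delimited file. The file name data.txt, the database name data.db, and the two-column items table are assumptions made purely for illustration.

```cpp
// Sketch: bulk-load a space-delimited file into SQLite from C++.
// Assumes a file "data.txt" with two fields per line and a table "items".
#include <sqlite3.h>
#include <cstdio>
#include <fstream>
#include <sstream>
#include <string>

int main() {
    sqlite3 *db = nullptr;
    if (sqlite3_open("data.db", &db) != SQLITE_OK) {
        std::fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    // Trade durability for speed during the bulk load.
    sqlite3_exec(db, "PRAGMA synchronous = OFF;", nullptr, nullptr, nullptr);
    sqlite3_exec(db, "CREATE TABLE IF NOT EXISTS items (name TEXT, value TEXT);",
                 nullptr, nullptr, nullptr);

    // One transaction around all ~30,000 inserts avoids a disk sync per row.
    sqlite3_exec(db, "BEGIN TRANSACTION;", nullptr, nullptr, nullptr);

    sqlite3_stmt *stmt = nullptr;
    sqlite3_prepare_v2(db, "INSERT INTO items (name, value) VALUES (?, ?);",
                       -1, &stmt, nullptr);

    std::ifstream in("data.txt");   // space-delimited input file (assumed name)
    std::string line, a, b;
    while (std::getline(in, line)) {
        std::istringstream fields(line);
        if (!(fields >> a >> b)) continue;           // skip malformed lines
        sqlite3_bind_text(stmt, 1, a.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_bind_text(stmt, 2, b.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_step(stmt);                          // execute the insert
        sqlite3_reset(stmt);                         // reuse the prepared statement
        sqlite3_clear_bindings(stmt);
    }

    sqlite3_finalize(stmt);
    sqlite3_exec(db, "COMMIT;", nullptr, nullptr, nullptr);
    sqlite3_close(db);
    return 0;
}
```

Wrapping all the inserts in a single transaction, as the sketch does, is usually the largest win for a load of this size, since SQLite otherwise waits for a disk sync after every individual INSERT.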