Question
Which is faster for millions of records: Permanent Table or Temp Tables?
I have to use it only for 15 million records. After processing is complete, we delete these records.
Answer
In your situation we would use a permanent table called a staging table. This is a common approach for large imports. In fact, we generally use two staging tables: one with the raw data and one with the cleaned-up data, which makes researching issues with the feed much easier (they are almost always the result of new and varied ways our clients find to send us junk data, but we have to be able to prove that). Plus, you avoid problems such as having to grow tempdb, or causing issues for other users who want to use tempdb but have to wait while it grows for you.
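The two-staging-table flow described above can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 rather than SQL Server; the table names, columns, and the crude validation rule are all assumptions for the sketch, not the answerer's actual schema.

```python
# Sketch of the staging-table pattern: raw feed -> cleaned staging -> target.
# Uses an in-memory SQLite database for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table 1: the raw feed, loaded as-is (everything as text),
# kept around so bad records can be researched later.
cur.execute("CREATE TABLE staging_raw (id TEXT, amount TEXT)")
cur.executemany(
    "INSERT INTO staging_raw VALUES (?, ?)",
    [("1", "10.50"), ("2", "junk"), ("3", "7.25")],
)

# Staging table 2: cleaned, typed data. Rows failing validation simply
# don't move forward; they remain visible in staging_raw as evidence.
cur.execute("CREATE TABLE staging_clean (id INTEGER, amount REAL)")
cur.execute(
    """
    INSERT INTO staging_clean
    SELECT CAST(id AS INTEGER), CAST(amount AS REAL)
    FROM staging_raw
    WHERE amount GLOB '[0-9]*'  -- crude numeric check, for the sketch only
    """
)

# Final load into the permanent target table, then clear the clean stage.
cur.execute("CREATE TABLE target (id INTEGER, amount REAL)")
cur.execute("INSERT INTO target SELECT * FROM staging_clean")
cur.execute("DELETE FROM staging_clean")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # → 2
```

Because the staging tables are permanent, none of this touches tempdb, and the rejected row ("junk") is still queryable in `staging_raw` after the load.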
You can also use SSIS and skip the staging table(s), but I find the ability to go back and research issues without having to reload a 50,000,000-row table very helpful.