Huge CSV and XML Files in Python

January 22, 2009. Filed under python

I, like most people, never expected I'd be dealing with large files. Oh, I knew there would be some files with megabytes of data, but I never suspected I'd be begging Perl to process hundreds of megabytes of XML, nor that this week I'd be asking Python to process 6.4 gigabytes of CSV into 6.5 gigabytes of XML [1].

As a few out-of-memory experiences will teach you, the trick for dealing with large files is pretty easy: use code that treats everything as a stream. For inputs, read from disk in chunks. For outputs, frequently write to disk and let system memory forge onward unburdened.

When reading and writing files yourself, this is easier to do correctly...

from __future__ import with_statement # for python 2.5

with open('data.in','r') as fin:
    with open('data.out','w') as fout:
        for line in fin:
            # only one line in memory at a time: convert spaces to commas and write it out
            fout.write(','.join(line.split(' ')))

...than it is to do incorrectly...

with open('data.in','r') as fin:
    data = fin.readlines()   # pulls the entire file into memory at once

data2 = [ ','.join(line.split(' ')) for line in data ]

with open('data.out','w') as fout:
    fout.writelines(data2)

...at least in simple cases.
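If the data doesn't split cleanly on newlines, the same streaming idea works with fixed-size chunks instead of lines. Here's a minimal sketch, where process_chunk stands in for some hypothetical transformation of your choosing:

def convert_in_chunks(in_path, out_path, chunk_size=1024*1024):
    with open(in_path, 'rb') as fin:
        with open(out_path, 'wb') as fout:
            while True:
                chunk = fin.read(chunk_size) # at most one megabyte in memory at a time
                if not chunk:
                    break
                fout.write(process_chunk(chunk)) # hypothetical transformation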

Loading Large CSV Files in Python

Python has an excellent csv library, which can handle large files right out of the box. Sort of.

>>> import csv
>>> r = csv.reader(open('doc.csv', 'rb'))
>>> for row in r:
...     print row
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
_csv.Error: field larger than field limit (131072)

Staring at the module documentation [2], I couldn't find anything of use. So I cracked open the csv.py file and confirmed what the _csv in the error message suggests: the bulk of the module's code (and the input parsing in particular) is implemented in C rather than Python.

After staring at that error for a while, I began dreaming up a stream pre-processor built on StringIO, but it didn't take long to realize I would need to recreate my own version of csv to pull that off.

So back to the blogs, one of which held the magic grain of information I was looking for: csv.field_size_limit.

>>> import csv
>>> csv.field_size_limit()
131072
>>> csv.field_size_limit(1000000000)
131072
>>> csv.field_size_limit()
1000000000

Yep. That's all there is to it. The sucker just works after that.
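In context, the fix is just a one-liner before you build the reader. A minimal sketch, where 'data.csv' and process are placeholders (process being a hypothetical per-row handler, not part of the csv module):

import csv

csv.field_size_limit(1000000000) # raise the per-field cap before reading

reader = csv.reader(open('data.csv', 'rb'))
for row in reader:
    process(row) # hypothetical per-row handler; each row is a list of strings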

Well, almost. I did run into an issue with a NULL byte 1.5 gigs into the data. Because the streaming code is written using C-based IO, the NULL byte shorts out the reading of data in an abrupt and non-recoverable manner. To get around it, we need to pre-process the stream somehow, which you could do in Python by wrapping the file with a custom class that cleans each line before returning it, but I went with a command line utility for simplicity.

cat data.in | tr -d '\0' > data.out

After that, the 6.4 gig CSV file processed without any issues.
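If you'd rather stay in Python instead of shelling out, the wrapper idea mentioned above can be sketched as a small generator that scrubs each line before the csv module ever sees it. This is a sketch, not what I actually ran, and process is again a hypothetical per-row handler:

import csv

def strip_nulls(f):
    for line in f:
        yield line.replace('\0', '') # drop NULL bytes before csv parses the line

reader = csv.reader(strip_nulls(open('data.in', 'rb')))
for row in reader:
    process(row) # hypothetical per-row handler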

Creating Large XML Files in Python

This part of the process, taking each row of CSV and converting it into an XML element, went fairly smoothly thanks to the xml.sax.saxutils.XMLGenerator class. The API for creating elements isn't an example of simplicity, but it is--unlike many of the more creative schemes--predictable, and it has one killer feature: it correctly writes its output to a stream.

As I mentioned, the mechanism for creating elements is a bit verbose, so I made a couple of wrapper functions to simplify it (note that I am sending output to standard out, which lets me print strings straight into the file I am generating, for example the XML file's version declaration).

import sys
from xml.sax.saxutils import XMLGenerator
from xml.sax.xmlreader import AttributesNSImpl

g = XMLGenerator(sys.stdout, 'utf-8')

def start_tag(name, attr={}, body=None, namespace=None):
    # build the {(namespace, localname): value} mappings that AttributesNSImpl expects
    attr_vals = {}
    attr_keys = {}
    for key, val in attr.iteritems():
        key_tuple = (namespace, key)
        attr_vals[key_tuple] = val
        attr_keys[key_tuple] = key

    attr2 = AttributesNSImpl(attr_vals, attr_keys)
    g.startElementNS((namespace, name), name, attr2)
    if body:
        g.characters(body)

def end_tag(name, namespace=None):
    g.endElementNS((namespace, name), name)

def tag(name, attr={}, body=None, namespace=None):
    # convenience for a leaf element: open it, write the body, and close it
    start_tag(name, attr, body, namespace)
    end_tag(name, namespace)

From there, usage looks like this:

print """<?xml version="1.0" encoding="utf-8'?>"""
start_tag(u'list', {u'id':10})

for item in some_list:
    start_tag(u'item', {u'id': item[0]})
    tag(u'title', body=item[1])
    tag(u'desc', body=item[2])
    end_tag(u'item')

end_tag(u'list')
g.endDocument()

The one issue I did run into (in my data) was some pagebreak characters floating around (^L, a.k.a. ASCII 12, a.k.a. '\x0c') which were tripping up the XML encoder, but you can strip them out in any of several places, for example by rewriting the main loop:

for item in some_list:
    item = [ x.replace('\x0c','') for x in item ]
    # etc

Really, the XMLGenerator just worked, even when dealing with quite a large file.
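For the curious, here is a rough sketch of how the CSV and XML halves fit together using the wrappers above; the three-column layout (id, title, desc) and the filename are purely assumptions for illustration, not my actual data:

import csv

csv.field_size_limit(1000000000)

print """<?xml version="1.0" encoding="utf-8"?>"""
start_tag(u'list')

for row in csv.reader(open('data.csv', 'rb')):
    row = [ x.replace('\x0c', '') for x in row ] # strip pagebreak characters
    start_tag(u'item', {u'id': row[0]})
    tag(u'title', body=row[1])
    tag(u'desc', body=row[2])
    end_tag(u'item')

end_tag(u'list')
g.endDocument()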

Performance

Although my script created a different mix of XML elements than the example above, it wasn't any more complex, and it had fairly reasonable performance. Processing the 6.4 gig CSV file into a 6.5 gig XML file took between 19 and 24 minutes; at roughly twenty minutes for 6.4 gigabytes, that works out to reading, processing, and writing about five megabytes per second.

In terms of raw speed that isn't particularly epic, but when I performed a similar operation (actually XML to XML rather than CSV to XML) with Perl's XML::Twig, it took eight minutes to process a ~100 megabyte file, so I'm pretty pleased with the quality of the Python standard library and how it handles large files.

The breadth and depth of the standard library really make Python a joy to work with for these simple one-shot scripts. If only it had Perl's easier-to-use regex syntax...


  1. This is a peculiar property of data, which makes it different from media: data files can--with a large enough system--grow arbitrarily large. Media files, on the other hand, can be extremely dense (a couple of gigs for a high quality movie), but they conform to predictable limits.

    If you are dealing with large files, you're probably dealing with a company's logs from the last decade or the entire dump of their MySQL database.

  2. I really want to like the new Python documentation. I mean, it certainly looks much better, but I think it has made it harder to actually find what I'm looking for. I think they've hit the same stumbling block as the Django documentation: the more you customize your documentation, the greater the learning curve for using your documentation.

    I think the big thing that gives me trouble is just the incompleteness of the documentation. They are sure to cover all the important and frequently used components (along with helpful overviews and examples), but the new docs often don't even mention the less important methods and objects.

    For the time being, I am throwing around a lot more dir() calls.

 
