How to use Perl to parse a very big file without DB support?

I am parsing a lot of huge log files right now and just wondering how to speed things up. I have to read in 3 files at the same time. One file is huge, 300+ MB, and I have 1.5 GB of RAM. The process looks damn slow.

Anyone have a good idea?
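A minimal sketch of the low-memory approach, assuming the logs can be handled one line at a time; the file names are placeholders, and the lockstep loop is only one way to read three files "at the same time":

#!/usr/bin/perl
use strict;
use warnings;

# Placeholder log names -- substitute the real files.
my @files = ('a.log', 'b.log', 'c.log');

my @fhs;
for my $f (@files) {
    open my $fh, '<', $f or die "Cannot open $f: $!";
    push @fhs, $fh;
}

# Pull one line from each handle per pass, so at most three lines are
# held in memory at any moment instead of slurping a 300+ MB file.
while (1) {
    my @lines = map { scalar readline($_) } @fhs;
    last unless grep { defined } @lines;    # stop when every file is done
    for my $line (@lines) {
        next unless defined $line;
        chomp $line;
        # ... parse $line here ...
    }
}

close $_ for @fhs;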

All replies:

Ever heard about mergesort? -ohlalala- 08/21/2006 16:45:02
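If the three logs are each already sorted (say, by timestamp), the mergesort idea boils down to a k-way merge: hold only the current line of each file and repeatedly emit the smallest. A rough sketch, assuming plain string comparison gives the right order and with placeholder file names:

#!/usr/bin/perl
use strict;
use warnings;

# Placeholder names for three logs that are each already sorted.
my @files = ('a.log', 'b.log', 'c.log');

my (@fhs, @current);
for my $i (0 .. $#files) {
    open $fhs[$i], '<', $files[$i] or die "Cannot open $files[$i]: $!";
    $current[$i] = readline($fhs[$i]);      # prime with the first line
}

while (1) {
    # Pick the handle whose current line sorts first.
    my $min;
    for my $i (0 .. $#files) {
        next unless defined $current[$i];
        $min = $i if !defined $min || $current[$i] lt $current[$min];
    }
    last unless defined $min;               # every file is exhausted

    print $current[$min];                   # emit the smallest line
    $current[$min] = readline($fhs[$min]);  # advance only that file
}

close $_ for @fhs;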

I haven't thought of sorting yet -德州老外- 08/21/2006 16:51:41

Reply: I haven't thought of sorting yet -ohlalala- 08/21/2006 16:53:13

Be more specific please -德州老外- 08/21/2006 17:02:09

Haha, I've figured out how to do it now -德州老外- 08/21/2006 17:33:00

Not bad, you learn pretty fast -ohlalala- 08/21/2006 18:44:20

The speed still isn't improving, but memory usage is way down. :D -德州老外- 08/21/2006 19:52:11
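If per-line reads are the remaining bottleneck, one thing worth trying is reading the file in large blocks and splitting the lines yourself; whether it helps depends on whether I/O or the parsing itself dominates. A sketch with a placeholder file name and a 1 MB block size:

#!/usr/bin/perl
use strict;
use warnings;

my $file = 'big.log';                   # placeholder name
open my $fh, '<', $file or die "Cannot open $file: $!";

local $/ = \1_048_576;                  # read 1 MB records instead of lines

my $buffer = '';
while (my $chunk = <$fh>) {
    $buffer .= $chunk;
    # Split off the complete lines; the last piece may be a partial
    # line, so keep it in $buffer until the next chunk arrives.
    my @lines = split /\n/, $buffer, -1;
    $buffer = pop @lines;
    for my $line (@lines) {
        # ... parse $line here ...
    }
}
# A final line without a trailing newline ends up here.
if (length $buffer) {
    # ... parse $buffer here ...
}
close $fh;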
