
Why does MATLAB need so much memory to read in a .csv file?

I have a 370 MB .csv file that I want to read into MATLAB. I cannot do it, because the program sequesters all of my available memory (approximately 5 GB) and is still not done with the import. This is a large file, but it is not that big. Is this normal?
  2 Comments
Matt J
Matt J on 22 Sep 2013
It doesn't sound normal. Your code, if you show it, could tell us why.
Randy
Randy on 22 Sep 2013
My "code" consists of hitting "import data", browsing to the file, selecting the vectors I want from the preview, and hitting "import selection". I do not know what code this runs (it does not show up in the command history).


Answers (1)

Matt J
Matt J on 22 Sep 2013
Just a guess, but if your file is full of numeric data, each data value can become about 8 times as large when converted from text to doubles. Seeing your actual code would provide more clues, though, as I commented above.
  3 Comments
Matt J
Matt J on 23 Sep 2013
Edited: Matt J on 23 Sep 2013
If your file is 370 MB and contains 20 million elements, it means that each element consumes about 18 bytes on average. That seems a bit odd to me. Are these 18-digit numbers?
Anyway, you could try xlsread, dlmread, or textscan to read one column at a time and see if that makes a difference.
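As a sketch of the column-at-a-time idea with textscan (the file name 'data.csv' and the five-column numeric layout here are assumptions for illustration; the %*f specifiers skip the columns you don't want):

```matlab
% Read only the 3rd of 5 comma-separated numeric columns into memory.
fid = fopen('data.csv', 'r');
C = textscan(fid, '%*f %*f %f %*f %*f', 'Delimiter', ',');
fclose(fid);
col3 = C{1};   % double column vector; 8 bytes per element in memory
```

Because the skipped fields are never converted or stored, peak memory stays close to the size of the one column you keep rather than the whole table.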
Image Analyst
Image Analyst on 23 Sep 2013
18 bytes per number is fine. Each number could be 18 characters if it had, say, 3 digits, a decimal point, 12 digits of mantissa, a comma, and a trailing space, like this: "123.123456789012, ". That's 18 ASCII characters, and if there were about 20 million sequences like that in the CSV text file, you'd get 370 MB.
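That arithmetic can be checked directly (the 20-million element count and the example string are taken from the comments above):

```matlab
bytesPerElement = numel('123.123456789012, ');  % 18 ASCII characters per value
nElements = 20e6;                               % element count from the thread
fileMB = bytesPerElement * nElements / 1e6      % about 360 MB, close to the 370 MB file
doubleMB = 8 * nElements / 1e6                  % same data as doubles: 8 bytes each
```

So the on-disk size is consistent with ordinary text; the multi-gigabyte spike during import must come from intermediate copies made by the import process, not from the final double array itself.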

