Is there a faster way than str2double() to convert from a string array into a matrix containing doubles?

Hi, I am working with large .txt files that I imported as a string array. A big part of this .txt file contains numeric values that I want to convert to doubles. Since the array is quite large (500,000 x 25), it takes MATLAB a very long time to convert these strings into doubles using str2double(). Is there a faster way to convert a string array into a numeric matrix?
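One commonly suggested alternative, sketched below under the assumption that the string array (called S here) contains nothing but plain numeric text, is to join all elements into one char vector and parse it with a single sscanf call instead of calling str2double element-wise:
% Minimal sketch: S stands in for the imported string array.
S = ["1.5" "2.5"; "3.5" "4.5"];                   % small example input
txt = char(join(reshape(S.', 1, []), " "));       % all values, row by row
M = reshape(sscanf(txt, '%f'), size(S, 2), []).'; % numeric matrix, same shape as S
Because sscanf parses the whole buffer in one pass, this is typically much faster than per-element str2double, but it offers no protection against malformed entries.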
  8 Comments
per isakson on 1 Dec 2017
  • "mport tool, it works fine.. I don't understand why" you didn't give the gui a helping hand?
  • (500000*25)*8/1e6 makes 100MB, which shouldn't be a problem
  • See Import Large Text File Data in Blocks (a block-wise textscan sketch follows below)
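The linked documentation example reads the file in fixed-size blocks with textscan. A minimal sketch along those lines (the file name data.txt, the ';' delimiter and the 25 %f columns are assumptions based on this thread, not the real file):
fmt = repmat('%f', 1, 25);            % 25 numeric columns per row
blockSize = 50000;                    % rows to read per call
fid = fopen('data.txt', 'r');
blocks = {};
while ~feof(fid)
    C = textscan(fid, fmt, blockSize, 'Delimiter', ';');
    blocks{end+1} = [C{:}];           % collect this block (few iterations, so growth is cheap)
end
fclose(fid);
Data = vertcat(blocks{:});            % e.g. 500000-by-25 double matrix
Reading in blocks keeps the peak memory of the intermediate arrays small while still avoiding any string-to-double conversion in MATLAB code.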


Accepted Answer

Jan on 1 Dec 2017
Edited: Jan on 6 Dec 2017
Importing the data as strings first is an unnecessary detour. The structure of the file looks simple, so what about using fscanf directly?
fid = fopen(FileName, 'r');
line1 = fgetl(fid);                                    % keep the first two header lines
line2 = fgetl(fid);
fgetl(fid);                                            % skip the third line
Head = cell(1e6, 1);                                   % pre-allocate generously
Data = cell(1e6, 1);
iData = 0;
while ~feof(fid)
    iData = iData + 1;
    Head{iData} = strrep(fgetl(fid), ';', '');         % header line of this record
    Data{iData} = fscanf(fid, '%g;%g;%g;%g', [4, 25]); % numeric block (adjust to the real layout)
    fgetl(fid);                                        % move past the rest of the last numeric line
end
Head = Head(1:iData);                                  % crop the unused pre-allocated cells
Data = Data(1:iData);
fclose(fid);
Note that text files are useful if they are edited or read by a human. Storing 500,000 x 25 numbers as text is a really weak design; storing them in a binary format would make the processing much more efficient.
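To illustrate the binary route, here is a minimal sketch (the file name data.bin and the variable name M are assumptions): the 500,000 x 25 matrix is written once as raw doubles, and reading it back later is a single fread with no text parsing at all:
fid = fopen('data.bin', 'w');
fwrite(fid, M, 'double');                 % raw column-wise doubles
fclose(fid);

fid = fopen('data.bin', 'r');
M2 = fread(fid, [500000, 25], 'double');  % restored without any parsing
fclose(fid);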

More Answers (1)

Renwen Lin on 3 Mar 2019
Edited: per isakson on 5 Mar 2019
  3 Comments
Jan on 17 Feb 2021
You are right. The OP had speed problems and thought that a faster STR2DOUBLE would solve them. But avoiding the need to call STR2DOUBLE at all is even faster.
The FEX submission suffers from some severe conversion problems:
str2doubleq('Inf') % NaN instead of Inf
str2doubleq('.i5') % 5 instead of NaN
str2doubleq('i') % 0 instead of 0 + 1i
str2doubleq('1e1.4') % 0.4 instead of NaN
str2doubleq('--1') % -1 instead of NaN
s = '12345678901234567890';
str2doubleq(s) - str2double(s) % 2048
s = '123.123e40';
str2doubleq(s) - str2double(s) % 1.547e26
str2double('2.236')-str2doubleq('2.236') % is not 0 ('2.235' is fine)
str2double('1,1')-str2doubleq('1,1') % 9.9 instead of 0
isreal(str2doubleq('1')) % 0 instead of 1
Part of the speedup is based on a missing memory cleanup. This function leaks memory, because it allocates strings with mxArrayToString without freeing them. With large cell arrays this exhausts gigabytes of RAM in seconds and you have to restart MATLAB to free it.
This tool is fast, but not reliable enough for scientific or productive work.

