Reducing running time in image processing
I wrote code to find circles in an image using imfindcircles and to do some further calculations on the detected circles. I plan to apply the code to 250000 images, and it currently takes 0.8 seconds per image. The processing of each image is completely independent of the other images. I am aware of the parfor command, but I would rather not use it because my code is already complex enough and I do not want to make it more so. Is there any way I can run the script in parallel to reduce the total time (rather than the 0.8-second running time per image)? It should be noted that in some parts of the code I take advantage of the GPU as well.
Accepted Answer
Walter Roberson
on 31 Aug 2013
parfor() and related commands such as spmd() are the main approach. Otherwise, especially if you are on Linux or OS X, run a script that hives off a number of separate MATLAB processes, each with slightly different parameters. Though if you are keeping a GPU busy, it is not certain that running multiple such processes would be any faster, since they would contend for the same GPU.
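As a rough sketch of the "several MATLAB processes" idea: split the 250000 images into equal chunks and launch one MATLAB process per chunk. The function name process_chunk and the chunk bookkeeping below are assumptions for illustration, not part of the asker's actual code; the script only echoes the commands so you can inspect them before running.

```shell
#!/bin/sh
# Sketch: split 250000 images across 4 MATLAB processes, one chunk each.
# process_chunk(first, last) is an assumed user function that processes
# images with indices first..last; replace it with your own entry point.
NIMAGES=250000
NWORKERS=4
CHUNK=$(( NIMAGES / NWORKERS ))

w=0
while [ "$w" -lt "$NWORKERS" ]; do
    first=$(( w * CHUNK + 1 ))
    last=$(( (w + 1) * CHUNK ))
    # For real use, drop 'echo', append '&' to each command to run it in
    # the background, and add a final 'wait' so the script blocks until
    # all workers finish.
    echo matlab -nodisplay -nosplash -r "process_chunk($first, $last); exit"
    w=$(( w + 1 ))
done
```

Each process writes its own results, so no coordination between them is needed; that independence is exactly what makes this approach applicable here.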
The usual method is to (A) optimize the algorithm; and (B) vectorize the code.
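For reference, when each iteration is independent (as here), converting a plain for loop to parfor is often a one-word change; the surrounding code does not need restructuring. This is a generic sketch, not the asker's actual code — the folder, file pattern, radius range, and result layout are all assumptions:

```matlab
% Sketch: parallelize an independent per-image loop with parfor.
% Folder name, '*.png' pattern, and the [10 50] radius range are
% placeholders for the asker's own settings.
files = dir(fullfile('images', '*.png'));
n = numel(files);
results = cell(n, 1);

parfor k = 1:n            % was: for k = 1:n
    img = imread(fullfile('images', files(k).name));
    [centers, radii] = imfindcircles(img, [10 50]);
    results{k} = struct('centers', centers, 'radii', radii);
end
```

Because each iteration touches only its own file and its own cell of results, parfor can distribute the iterations across workers without any further changes.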