Main Content

bagOfFeatures

Bag of visual words object

Description

You can construct a bag of visual words for use in image category classification, image retrieval, or loop closure detection in visual simultaneous localization and mapping (vSLAM).

Creation

Description


bag = bagOfFeatures(imds) returns a bag of features object. The bag output object is generated using samples from the imds input. By default, the visual vocabulary is created from SURF features extracted from images in imds.

bag = bagOfFeatures(imds,'CustomExtractor',extractorFcn) returns a bag of features that uses a custom feature extractor function to extract features from images in imds. extractorFcn is a function handle to a custom feature extraction function.

bag = bagOfFeatures(imds,Name=Value) specifies options using one or more name-value arguments in addition to any combination of arguments from previous syntaxes. For example, bag = bagOfFeatures(imds,Verbose=true) additionally sets Verbose to true.
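
For example, the following sketch (assuming imds is an ImageDatastore and combining options documented below) builds a two-level vocabulary tree, keeps only half of the strongest features, and suppresses progress output:

% Sketch: combine several documented name-value arguments
bag = bagOfFeatures(imds,TreeProperties=[2 100], ...
    StrongestFeatures=0.5,Verbose=false);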

This object supports parallel computing using multiple MATLAB® workers. Enable parallel computing from the Computer Vision Toolbox Preferences dialog box. To open Computer Vision Toolbox™ preferences, on the Home tab, in the Environment section, click Preferences. Then select Computer Vision Toolbox.

Input Arguments


Images, specified as an ImageDatastore object. The bagOfFeatures object extracts an equal number of the strongest features from the images contained in the imds object. The number of strongest features is defined as:

number of strongest features = min(number of features found in each set) x StrongestFraction

The object obtains the StrongestFraction value from the StrongestFeatures property.
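
As an illustrative calculation with hypothetical feature counts, if the image sets yield 1200, 1500, and 1000 features and StrongestFeatures is 0.8, the object keeps min(1200,1500,1000) x 0.8 = 800 of the strongest features from each set:

featureCounts = [1200 1500 1000];   % hypothetical number of features found in each set
strongestFraction = 0.8;            % value of the StrongestFeatures property
numStrongest = min(featureCounts)*strongestFraction   % = 800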

Custom feature extractor function, specified as a function handle. This custom function extracts features to learn the visual vocabulary of the object.

The function, extractorFcn, must be specified as a function handle for a file:

extractorFcn = @exampleBagOfFeaturesExtractor;
bag = bagOfFeatures(imds,CustomExtractor=extractorFcn)
where exampleBagOfFeaturesExtractor is a MATLAB function. For example:

function [features,featureMetrics,location] = exampleBagOfFeaturesExtractor(img)
...

The function must be on the path or in the current working directory. The arguments are defined as:

img (Input)

  • Binary, grayscale, or truecolor image.

  • The input image is from the image set that was originally passed into bagOfFeatures.

features (Output)

  • A binaryFeatures object, or an M-by-N numeric matrix of image features, where M is the number of features and N is the length of each feature vector.

  • The feature length, N, must be greater than zero and be the same for all images processed during the bagOfFeatures creation process.

  • If you cannot extract features from an image, supply an empty feature matrix and an empty feature metrics vector. Use the empty matrix and vector if, for example, you did not find any keypoints for feature extraction.

  • Numeric, real, and nonsparse.

featureMetrics (Output)

  • An M-by-1 vector of feature metrics indicating the strength of each feature vector.

  • Used to apply the StrongestFeatures selection criterion in the bagOfFeatures framework.

  • Numeric, real, and nonsparse.

location (Output)

  • An M-by-2 matrix of 1-based [x y] values.

  • The [x y] values can be fractional.

  • Numeric, real, and nonsparse.

For more details on the custom extractor function and its input and output requirements, see Create a Custom Feature Extractor.

You can open an example function file and use it as a template by typing the following command at the MATLAB command line:

edit('exampleBagOfFeaturesExtractor.m')
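
For reference, the following minimal sketch shows the general shape of such a function. It uses SURF detection and extraction; the function name and the specific choices are illustrative and are not the contents of the shipped example file:

function [features,featureMetrics,location] = myCustomSURFExtractor(img)
% Minimal custom extractor sketch (hypothetical helper, for illustration only).

% Convert truecolor input to grayscale before detection.
if size(img,3) == 3
    img = rgb2gray(img);
end

% Detect SURF points and extract upright descriptors at those locations.
points = detectSURFFeatures(img);
[features,validPoints] = extractFeatures(img,points,Upright=true);

% Use the detector metric as the feature strength, and return 1-based [x y] locations.
featureMetrics = validPoints.Metric;
location = validPoints.Location;
end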

Properties


Custom feature extractor function, specified as a function handle. The custom feature extractor function extracts features used to learn the visual vocabulary for bagOfFeatures. Specify 'CustomExtractor' together with extractorFcn, a function handle to a custom feature extraction function.

The function, extractorFcn, must be specified as a function handle for a file:

extractorFcn = @exampleBagOfFeaturesExtractor;
bag = bagOfFeatures(imds,CustomExtractor=extractorFcn)
where exampleBagOfFeaturesExtractor is a MATLAB function such as:

function [features,featureMetrics] = exampleBagOfFeaturesExtractor(img)
...

The function must be on the path or in the current working directory.

For more details on the custom extractor function and its input and output requirements, see Create a Custom Feature Extractor. You can open an example function file and use it as a template by typing the following command at the MATLAB command line:

edit('exampleBagOfFeaturesExtractor.m')

Vocabulary tree properties, specified as a two-element vector in the form [numLevels branchingFactor]. numLevels is an integer that specifies the number of levels in the vocabulary tree. branchingFactor is an integer that controls how much the vocabulary can grow at successive levels of the tree. The maximum number of visual words that the vocabulary tree can represent is branchingFactor^numLevels. Typical values for numLevels are between 1 and 6. Typical values for branchingFactor are between 10 and 500. Use an empirical analysis to select optimal values.

Increase the branching factor to generate a larger vocabulary. Increasing the vocabulary size improves classification and image retrieval accuracy, but also increases the time required to encode images. You can use a vocabulary tree with multiple levels to create vocabularies on the order of 10,000 visual words or more. A multilevel tree reduces the time required to encode images with large vocabularies, but takes longer to create. For vocabularies that contain only 100 to 1000 visual words, you can use a tree with one level.
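
For example, this sketch (assuming imds is defined as above) requests a two-level tree with a branching factor of 100, which can represent up to 100^2 = 10,000 visual words:

bag = bagOfFeatures(imds,TreeProperties=[2 100]);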

Fraction of strongest features, specified as the comma-separated pair consisting of 'StrongestFeatures' and a value in the range [0,1]. The value represents the fraction of strongest features to use from each label in the imds input.
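
For example, this sketch keeps only the strongest 50% of the features from each label:

bag = bagOfFeatures(imds,StrongestFeatures=0.5);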

Enable progress display to screen, specified as the comma-separated pair consisting of 'Verbose' and the logical true or false.

Selection method for picking point locations for SURF feature extraction, specified as the comma-separated pair consisting of 'PointSelection' and either "Grid" or "Detector". There are two stages for feature extraction. First, you select a method for picking the point locations (SURF "Detector" or "Grid") with the PointSelection property. The second stage extracts the features. The feature extraction uses a SURF extractor for both point selection methods.

When you set PointSelection to "Detector", the feature points are selected using a speeded up robust feature (SURF) detector. Otherwise, the points are picked on a predefined grid with spacing defined by 'GridStep'. This property applies only when you are not specifying a custom extractor with the CustomExtractor property.
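
For example, this sketch selects feature point locations with the SURF detector instead of the default grid:

bag = bagOfFeatures(imds,PointSelection="Detector");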

Grid step size in pixels, specified as the comma-separated pair consisting of 'GridStep' and a 1-by-2 [x y] vector. This property applies only when you set PointSelection to "Grid" and you are not specifying a custom extractor with the CustomExtractor property. The steps in the x and y directions define the spacing of a uniform grid. Intersections of the grid lines define the locations for feature extraction.

Patch size used to extract the upright SURF descriptor, specified as the comma-separated pair consisting of 'BlockWidth' and a 1-by-N vector of N block widths. This property applies only when you set PointSelection to "Grid" and you are not specifying a custom extractor with the CustomExtractor property. Each element of the vector corresponds to the size of a square block from which the function extracts upright SURF descriptors. Use multiple square sizes to extract multiscale features. All of the specified square sizes are used at each extraction point on the grid. The block width corresponds to the scale of the feature. The minimum BlockWidth is 32 pixels.
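
For example, this sketch extracts multiscale upright SURF descriptors on a coarser grid using two block sizes:

bag = bagOfFeatures(imds,PointSelection="Grid", ...
    GridStep=[16 16],BlockWidth=[32 64]);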

Orientation of the SURF feature vector, specified as the comma-separated pair consisting of 'Upright' and a logical scalar. This property applies only when you are not specifying a custom extractor with the CustomExtractor property. Set this property to true when you do not need to estimate the orientation of the SURF feature vectors. Set it to false when you need the image descriptors to capture rotation information.
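
For example, this sketch sets Upright to false so that the extracted SURF descriptors capture rotation information:

bag = bagOfFeatures(imds,Upright=false);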

Object Functions

encode - Create histogram of visual word occurrences

Examples


Create an image datastore.

setDir  = fullfile(toolboxdir('vision'),'visiondata','imageSets');
imds = imageDatastore(setDir,'IncludeSubfolders',true,'LabelSource',...
    'foldernames');

Create the bag of features. This process can take a few minutes.

bag = bagOfFeatures(imds);
Creating Bag-Of-Features.
-------------------------
* Image category 1: books
* Image category 2: cups
* Selecting feature point locations using the Grid method.
* Extracting SURF features from the selected feature point locations.
** The GridStep is [8 8] and the BlockWidth is [32 64 96 128].

* Extracting features from 12 images...done. Extracted 230400 features.

* Keeping 80 percent of the strongest features from each category.

* Creating a 500 word visual vocabulary.
* Number of levels: 1
* Branching factor: 500
* Number of clustering steps: 1

* [Step 1/1] Clustering vocabulary level 1.
* Number of features          : 184320
* Number of clusters          : 500
* Initializing cluster centers...100.00%.
* Clustering...completed 26/100 iterations (~1.57 seconds/iteration)...converged in 26 iterations.

* Finished creating Bag-Of-Features

Compute the histogram of visual word occurrences for one of the images. Store the histogram as a feature vector.

img = readimage(imds, 1);
featureVector = encode(bag,img);
Encoding images using Bag-Of-Features.
--------------------------------------
* Encoding an image...done.

Create an image datastore.

setDir  = fullfile(toolboxdir('vision'),'visiondata','imageSets');
imds = imageDatastore(setDir,'IncludeSubfolders',true,'LabelSource',...
    'foldernames');

Specify a custom feature extractor.

extractor = @exampleBagOfFeaturesExtractor;
bag = bagOfFeatures(imds,'CustomExtractor',extractor)
Creating Bag-Of-Features.
-------------------------
* Image category 1: books
* Image category 2: cups
* Extracting features using a custom feature extraction function: exampleBagOfFeaturesExtractor.

* Extracting features from 12 images...done. Extracted 230400 features.

* Keeping 80 percent of the strongest features from each category.

* Creating a 500 word visual vocabulary.
* Number of levels: 1
* Branching factor: 500
* Number of clustering steps: 1

* [Step 1/1] Clustering vocabulary level 1.
* Number of features          : 184320
* Number of clusters          : 500
* Initializing cluster centers...100.00%.
* Clustering...completed 19/100 iterations (~1.37 seconds/iteration)...converged in 19 iterations.

* Finished creating Bag-Of-Features
bag = 
  bagOfFeatures with properties:

      CustomExtractor: @exampleBagOfFeaturesExtractor
       NumVisualWords: 500
       TreeProperties: [1 500]
    StrongestFeatures: 0.8000

References

[1] Csurka, G., C. Dance, L. Fan, J. Willamowski, and C. Bray. "Visual Categorization with Bags of Keypoints." Workshop on Statistical Learning in Computer Vision, ECCV, 2004, pp. 1–2.

[2] Nister, D., and H. Stewenius. "Scalable Recognition with a Vocabulary Tree." Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2006, vol. 2, pp. 2161–2168.

Version History

Introduced in R2014b