
I generated a texture image (using the LBP code shown in the clarification below).

I have to compare two such textures. I have used the histogram comparison method:

import cv2

image_file = 'output_ori.png'
img_bgr = cv2.imread(image_file)
height, width, channel = img_bgr.shape

# Histogram of the first channel of the original texture
hist_lbp = cv2.calcHist([img_bgr], [0], None, [256], [0, 256])
print("second started")

image_fileNew = 'output_scan.png'
img_bgr_new = cv2.imread(image_fileNew)
height_new, width_new, channel_new = img_bgr_new.shape
print("second lbp")

# Histogram of the scanned texture
hist_lbp_new = cv2.calcHist([img_bgr_new], [0], None, [256], [0, 256])

print("comparison started")

compare = cv2.compareHist(hist_lbp, hist_lbp_new, cv2.HISTCMP_CORREL)

print(compare)

But this method is not effective: it gives similar results for two different image textures, and it does not show enough variation to identify the print-and-scan effect. How do I compare the textures? I thought of analysing the GLCM characteristics.

import cv2
import numpy as np
from skimage.feature import greycomatrix, greycoprops

img = cv2.imread('images/noised_img1.jpg', 0)

# Quantize to 4 gray levels so that greycomatrix accepts levels=4
image = np.array(img // 64, dtype=np.uint8)
g = greycomatrix(image, [1, 2], [0, np.pi/2], levels=4, normed=True, symmetric=True)
contrast = greycoprops(g, 'contrast')
print(contrast)

In this method, I get the output as a 2×2 matrix (two distances × two angles). How do I compare two sets of such matrices across several features like contrast, dissimilarity, homogeneity, ASM, energy and correlation?
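(One illustrative possibility, not part of the original post: flatten each property's distance-by-angle matrix into a single feature vector per image and compare the vectors with a distance measure. The sketch below assumes both images are quantized to the same number of gray levels; the helper name and the img_a/img_b variables are hypothetical.)

import numpy as np
from skimage.feature import greycomatrix, greycoprops

PROPS = ('contrast', 'dissimilarity', 'homogeneity', 'ASM', 'energy', 'correlation')

def glcm_feature_vector(gray_image, levels=4):
    # Quantize to `levels` gray levels so the GLCM stays small
    quantized = (gray_image // (256 // levels)).astype(np.uint8)
    g = greycomatrix(quantized, [1, 2], [0, np.pi / 2],
                     levels=levels, normed=True, symmetric=True)
    # Each property yields a (distances x angles) matrix; flatten and concatenate them
    return np.concatenate([greycoprops(g, p).ravel() for p in PROPS])

# feats_a = glcm_feature_vector(img_a)
# feats_b = glcm_feature_vector(img_b)
# dist = np.linalg.norm(feats_a - feats_b)   # smaller = more similar

Whether a plain Euclidean distance is meaningful here depends on how the individual features are scaled, so some per-feature normalization may be needed.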

Clarification (in response to the comments):

import numpy as np
from PIL import Image

class LBP:
    def __init__(self, input, num_processes, output):
        # Convert the image to grayscale
        self.image = Image.open(input).convert("L")
        self.width = self.image.size[0]
        self.height = self.image.size[1]
        self.patterns = []
        self.num_processes = num_processes
        self.output = output

    def execute(self):
        self._process()
        if self.output:
            self._output()

    def _process(self):
        pixels = list(self.image.getdata())
        pixels = [pixels[i * self.width:(i + 1) * self.width] for i in range(self.height)]

        # Calculate LBP for each non-edge pixel
        for i in range(1, self.height - 1):
            # Cache only the rows we need (within the neighborhood)
            previous_row = pixels[i - 1]
            current_row = pixels[i]
            next_row = pixels[i + 1]

            for j in range(1, self.width - 1):
                # Compare this pixel to its neighbors, starting at the top-left pixel and moving
                # clockwise, and use bit operations to efficiently update the feature vector
                pixel = current_row[j]
                pattern = 0
                pattern = pattern | (1 << 0) if pixel < previous_row[j-1] else pattern
                pattern = pattern | (1 << 1) if pixel < previous_row[j] else pattern
                pattern = pattern | (1 << 2) if pixel < previous_row[j+1] else pattern
                pattern = pattern | (1 << 3) if pixel < current_row[j+1] else pattern
                pattern = pattern | (1 << 4) if pixel < next_row[j+1] else pattern
                pattern = pattern | (1 << 5) if pixel < next_row[j] else pattern
                pattern = pattern | (1 << 6) if pixel < next_row[j-1] else pattern
                pattern = pattern | (1 << 7) if pixel < current_row[j-1] else pattern
                self.patterns.append(pattern)

    def _output(self):
        # Write the result to an image file
        result_image = Image.new(self.image.mode, (self.width - 2, self.height - 2))
        result_image.putdata(self.patterns)
        result_image.save("output.png")

I generated the texture with this code. I have the texture and I have methods to calculate its properties, but the question is how to measure the similarity between two textures.

Comments:

  • You use "LBP" in your question title. LBP (or any of its variants) is a really good way to compare textures. Why does the body of the question not talk at all about LBP? Instead you talk about histograms, which don't characterize texture at all, and GLCM, which has been blown out of the water by LBP and other methods in every test over the last 25 years. Commented Jul 9, 2018 at 15:27
  • I generated textures using LBP. Now the task is to compare two textures. For comparing I spoke about GLCM. From your answer, what I understand is: we can compare textures using LBP. Am I right? And how? Commented Jul 10, 2018 at 4:45
  • LBP (Local Binary Patterns) is a method to characterize textures. I don't know how you used them to generate textures. If you Google for "LBP textures" you'll get lots of papers, software documentation, etc. that will help you further. Both SciKit and OpenCV (which you added tags for) have implementations. Commented Jul 10, 2018 at 6:17
  • This is what I understand you are doing: you start off with some image, compute the LBP, but instead of accumulating the value for each pixel in a histogram, you output it as a (binary?) image. Then you want to use a different texture characterization technique to quantify these. This explains why you were using a histogram. Commented Jul 10, 2018 at 13:47
  • I suggest you compute the histogram of LBP.patterns (don't convert to an image, as the data type conversion could cause problems!). Look into the LBP literature. There are methods to further reduce the 256 different values, which might be useful in your case, as the resulting features will have many fewer dimensions and will be easier to compare. Commented Jul 10, 2018 at 13:48
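Following that last suggestion, here is a minimal sketch (not from the original post; 'original.png' and 'scanned.png' are placeholder file names) that builds normalized histograms straight from LBP.patterns and compares them:

import numpy as np

# Run the LBP class defined above on the two source images, without writing PNGs
lbp_a = LBP("original.png", num_processes=1, output=False)
lbp_a.execute()
lbp_b = LBP("scanned.png", num_processes=1, output=False)
lbp_b.execute()

# 256-bin normalized histograms of the raw pattern values (no image conversion)
hist_a, _ = np.histogram(lbp_a.patterns, bins=np.arange(257), density=True)
hist_b, _ = np.histogram(lbp_b.patterns, bins=np.arange(257), density=True)

# Smaller distance = more similar textures
print(np.linalg.norm(hist_a - hist_b))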

1 Answer


Suppose you have two classes, for example couscous and knitwear, and you wish to classify an unknown color image as either couscous or knitwear. One possible method would be:

  1. Converting the color images to grayscale.
  2. Computing the local binary patterns.
  3. Calculating the normalized histogram of local binary patterns.

The following snippet implements this approach:

import numpy as np
from skimage import io, color
from skimage.feature import local_binary_pattern

def lbp_histogram(color_image):
    img = color.rgb2gray(color_image)
    patterns = local_binary_pattern(img, 8, 1)
    hist, _ = np.histogram(patterns, bins=np.arange(2**8 + 1), density=True)
    return hist

couscous = io.imread('https://i.sstatic.net/u3xLI.png')
knitwear = io.imread('https://i.sstatic.net/Zj14J.png')
unknown = io.imread('https://i.sstatic.net/JwP3j.png')

couscous_feats = lbp_histogram(couscous)
knitwear_feats = lbp_histogram(knitwear)
unknown_feats = lbp_histogram(unknown)

Then you need to measure the similarity (or dissimilarity) between the LBP histogram of the unknown image and the histograms of the images that represent the two considered classes. Euclidean distance between histograms is a popular dissimilarity measure.

In [63]: from scipy.spatial.distance import euclidean

In [64]: euclidean(unknown_feats, couscous_feats)
Out[64]: 0.10165884804845844

In [65]: euclidean(unknown_feats, knitwear_feats)
Out[65]: 0.0887492936776889

In this example the unknown image will be classified as knitwear because the dissimilarity unknown-couscous is greater than the dissimilarity unknown-knitwear. This is in good agreement with the fact that the unknown image is actually a different type of knitwear.
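To make the nearest-class decision explicit, one could wrap the comparison in a small helper (an illustrative sketch, not part of the original answer):

from scipy.spatial.distance import euclidean

def classify(unknown_hist, class_hists):
    # Return the name of the class whose LBP histogram is closest (smallest Euclidean distance)
    return min(class_hists, key=lambda name: euclidean(unknown_hist, class_hists[name]))

print(classify(unknown_feats, {'couscous': couscous_feats,
                               'knitwear': knitwear_feats}))  # expected: knitwear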

import matplotlib.pyplot as plt

hmax = max([couscous_feats.max(), knitwear_feats.max(), unknown_feats.max()])
fig, ax = plt.subplots(2, 3)

ax[0, 0].imshow(couscous)
ax[0, 0].axis('off')
ax[0, 0].set_title('Cous cous')
ax[1, 0].plot(couscous_feats)
ax[1, 0].set_ylim([0, hmax])

ax[0, 1].imshow(knitwear)
ax[0, 1].axis('off')
ax[0, 1].set_title('Knitwear')
ax[1, 1].plot(knitwear_feats)
ax[1, 1].set_ylim([0, hmax])
ax[1, 1].axes.yaxis.set_ticklabels([])

ax[0, 2].imshow(unknown)
ax[0, 2].axis('off')
ax[0, 2].set_title('Unknown (knitwear)')
ax[1, 2].plot(unknown_feats)
ax[1, 2].set_ylim([0, hmax])
ax[1, 2].axes.yaxis.set_ticklabels([])

plt.show()

[Plots: the three textures (cous cous, knitwear, unknown knitwear) with their LBP histograms below each image.]
