Open Access

Quantitative model construction of colour application in modern ceramic art creation

Feb 05, 2025

Introduction

With the rapid economic development driven by high technology, modern society has seen drastic changes in people's living habits, aesthetic orientation, ideology, and ways of thinking. The fast pace and efficiency of modern life, together with intense social competition, have led people to pursue a more novel and stimulating outlook on life [1]. Changes in lifestyle and social structure have pushed literature, art, and the other humanities to develop in modernized forms. People's aesthetic expectations of art are likewise changing and renewing, and it has become a demand of the times for artistic creation to modernize, break free from old constraints, and discover, excavate, and explore a new world with new concepts [2-3]. Artistic creation should embody the aesthetic characteristics of its era, form a contemporary style, and display contemporary elegance, which requires artists to adopt new ways of artistic thinking and new formal languages to break through old, formulaic traditional concepts and conventions. Modern ceramics is a product of modern society and modern civilization, closely tied to the development of modern science and technology, the improvement of material life, and changes in modern people's aesthetic interests and spiritual needs. For the modern ceramic artist, the medium materializes and visualizes personal thoughts, feelings, and aesthetics [4-6]. To innovate in ways of thinking and formal language in modern ceramic creation, we must therefore move beyond traditional ceramic concepts and traditional ceramic aesthetic tastes, and re-examine modern ceramics in terms of material, modeling, decoration, technology, and artistic concepts [7-8].

Color is an important element and artistic language with which artists design and express their works, making the works more artistically infectious and striking. The beauty of color language is now widely used across many fields, which reflects the maturity and healthy development of its aesthetic taste [9-10]. Whether in pottery, photography, sculpture, painting, design, or other fields, high-purity color language is applied extensively. In nature, people are often attracted by the colors of things around them: the colors of many scenes combine saturation, brightness, and purity in certain proportions, producing an indescribable, harmonious beauty [11-12]. Bright, eye-catching colors, used properly in modern ceramic art, give a creation strong, warm emotional color and exuberant vitality. Ceramic creators pour their feelings into their work, breaking with conventional design and cultural concepts and blending them cleverly through the characteristics of the clay and the glaze color to express their ideas to the fullest; the colors presented vary with the means of expression [13-16]. Constructing a quantitative model through numerical calculation makes it easy to obtain the plane geometry and color parameters and characteristics of modern ceramic art creation. Reasonable and effective management and use of color resources can help ceramic creators understand color trends, predict and develop color schemes suited to their own characteristics and target consumers, and guide the development of new ceramic art products [17-20].

This research focuses on color matching and application in ceramic art creation and proposes a color quantization algorithm based on visual characteristics together with a color matching measurement algorithm. In the color quantization algorithm, human visual characteristics are used to measure the similarity between colors dynamically by combining color frequency and color difference, so that colors are quantized within each logical region defined by the combination of the two. A clustering feature tree is used to implement the split-merge algorithm efficiently for quantifying ceramic colors. We propose a similarity measurement model for evaluating paired color matching schemes, construct a visual perception measure based on eye-tracking technology, and build an intelligent color matching model integrating visual aesthetics on the theoretical foundation of an image translation model combined with a visual aesthetics data flow. The effectiveness and performance of the color quantization algorithm and the color matching measurement algorithm are verified, and a series of ceramic works, "Days on the Clouds", is designed with them and compared with a series of ceramic works A, not designed with this paper's methods, in terms of eye-movement behavioral indexes, color matching preferences, and visual aesthetics evaluation.

Quantitative model of ceramic colour application

With the rapid development of industrial technology, the ceramic production industry has grown as well, while the traditional ceramic industry has developed relatively slowly. To cope with the rapid iteration of products in the ceramic market, how to quantify the application of color in ceramic art and achieve efficient color matching and design has become a topic of great concern in the industry. This study therefore proposes a quantitative algorithm for ceramic color based on visual characteristics, and further proposes a method for measuring and evaluating ceramic color matching on the basis of the quantified color data.

Ceramic colour quantification algorithms
Similarity criteria

The similarity between colors in a color quantification algorithm based on visual properties involves both color frequency and color difference. Color frequency can be obtained by scanning the whole image, while color difference is the distance between two colors, defined differently in different color models. Among the many color models that have been proposed, the RGB model is the most widely used in practice, and the position of each color in this model is determined by the 3D spatial coordinates (R, G, B) [21]. Each point in the color space can be represented by a vector X = (Rx, Gx, Bx)^T, and the similarity of two colors X and Y can be expressed through a norm of their difference vector E = X - Y, i.e., the distance D(X,Y) = ||E|| = ||X - Y||, where D(X,Y) is usually called the color distance. For example, with the Euclidean norm [22]: D(X,Y) = sqrt((Rx - Ry)^2 + (Gx - Gy)^2 + (Bx - By)^2)
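As a concrete illustration, the Euclidean colour distance can be sketched in a few lines of Python (a minimal example, not the paper's implementation):

```python
import math

def euclidean_color_distance(x, y):
    """Euclidean distance D(X, Y) between two RGB colours
    x = (Rx, Gx, Bx) and y = (Ry, Gy, By)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
```

For instance, black (0, 0, 0) and the colour (3, 4, 0) lie at distance 5 in RGB space.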

Intuitively, the smaller the norm, the shorter the distance between two colors in the color space, and the closer the two colors are. However, since the RGB color space is not a perceptually uniform linear space, classifying colors by shortest distance does not fully agree with how human vision classifies them. In 1976 the International Commission on Illumination (CIE) defined the uniform color space (L*, a*, b*), where L* measures lightness and a*, b* measure chromaticity, defined as follows: L* = 116(Y/Y0)^(1/3) - 16

When Y/Y0 > 0.01: a* = 500[(X/X0)^(1/3) - (Y/Y0)^(1/3)], b* = 200[(Y/Y0)^(1/3) - (Z/Z0)^(1/3)]

where X0 = 95.0, Y0 = 100.0, Z0 = 108.9, and X, Y, Z are related to the RGB chromaticity space by: X = 100(0.430R + 0.342G + 0.178B)/255, Y = 100(0.300R + 0.590G + 0.110B)/255, Z = 100(0.020R + 0.130G + 0.939B)/255
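Under the stated transform, the conversion can be sketched as follows (a sketch that assumes the reconstructed Z row 0.020R + 0.130G + 0.939B and handles only the Y/Y0 > 0.01 branch given in the text):

```python
def rgb_to_lab(r, g, b):
    """RGB -> (L*, a*, b*) via the linear XYZ transform stated above.
    Assumes Y/Y0 > 0.01, as in the text's definition of L*."""
    x = 100 * (0.430 * r + 0.342 * g + 0.178 * b) / 255
    y = 100 * (0.300 * r + 0.590 * g + 0.110 * b) / 255
    z = 100 * (0.020 * r + 0.130 * g + 0.939 * b) / 255
    x0, y0, z0 = 95.0, 100.0, 108.9
    cube_root = lambda t: t ** (1 / 3)
    l_star = 116 * cube_root(y / y0) - 16
    a_star = 500 * (cube_root(x / x0) - cube_root(y / y0))
    b_star = 200 * (cube_root(y / y0) - cube_root(z / z0))
    return l_star, a_star, b_star
```

Pure white (255, 255, 255) maps to approximately (100, 0, 0), confirming that the white point of the transform matches (X0, Y0, Z0).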

Theoretically, it is possible to convert the RGB colour space to a uniform colour space, perform colour quantisation there, and then convert the quantisation results back to RGB to generate the quantised image, but a more practical solution is to adjust the RGB colour space to partially compensate for its non-uniformity. For example, a distance formula using weighting factors in RGB space: D(X,Y) = sqrt(w1(Rx - Ry)^2 + w2(Gx - Gy)^2 + w3(Bx - By)^2)

where w1, w2, w3 are weighting coefficients chosen according to the sensitivity of the human eye to the three RGB primaries. In the colour space, each colour of a colour image is a three-dimensional pattern sample, which allows colour classification using pattern recognition theory.
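A weighted variant is equally short; the weights below (0.30, 0.59, 0.11, mirroring the luminance coefficients) are assumed for illustration, since the text does not fix concrete values:

```python
import math

# Illustrative weights: the eye is most sensitive to green, then red, then blue.
W = (0.30, 0.59, 0.11)

def weighted_color_distance(x, y, w=W):
    """Weighted RGB distance D(X, Y) with per-channel coefficients w1, w2, w3."""
    return math.sqrt(sum(wi * (a - b) ** 2 for wi, a, b in zip(w, x, y)))
```

Because these example weights sum to 1, the black-to-white distance remains 255, while per-channel differences are rescaled perceptually.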

In this paper we mainly discuss quantization algorithms from 24-bit true colour images to N-colour images in the most commonly used RGB colour space, where effects consistent with other colour spaces are obtained by setting the weighting coefficients in RGB space. In fact, most of the algorithms discussed are not limited by these conditions and can easily be applied to other colour spaces.

The following definitions are made using equation (1).

Definition 1: the function D(ci, cj) is defined as the geometric distance of the colour vectors ci, cj.

Definition 2: the function N(ci) is defined as the number of pixels of colour vector ci.

Definition 3: the function Max_D(ci, cj) selects the pair for which the geometric distance of the colour vectors ci, cj is maximal, satisfying: D(ci, cj) > D(ci, ck), where ci ∈ P, cj ∈ Q, ck ∈ Q and cj ≠ ck

P is the initial set of clustering centres for the dominant image style and Q is the set of colour samples.

Definition 4: the function S(ci, cj) is the similarity between the colour vectors ci, cj, with the expression: S(ci, cj) = (N(ci)N(cj) / (N(ci) + N(cj)))^p * D(ci, cj)^q

A similarity that incorporates both frequency and colour information better represents the relationship between colour samples. The product N(ci)N(cj) in the numerator of Eq. (10) means that, at the same geometric distance, colour samples containing more pixels yield a larger S(ci, cj); they are therefore treated as less similar and are more likely to become cluster centres, which makes the quantization results agree with subjective human perception. The exponents p and q adjust the respective roles of pixel frequency and geometric colour distance, and can be tuned to satisfy different requirements for the style and key details of the whole image.
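Definition 4 can be sketched directly (a minimal illustration with p = q = 1; larger S favours frequent, mutually distant colours as cluster centres):

```python
import math

def color_distance(ci, cj):
    """Geometric (Euclidean) distance D(ci, cj) of Definition 1."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(ci, cj)))

def similarity(ci, cj, n_i, n_j, p=1.0, q=1.0):
    """S(ci, cj) of Definition 4, combining pixel frequency and distance."""
    freq = (n_i * n_j) / (n_i + n_j)
    return freq ** p * color_distance(ci, cj) ** q
```

Doubling both pixel counts doubles the frequency term, so the score rewards well-populated colours at a given distance.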

Clustering criteria

After finding the original cluster centres from the image, the remaining colour samples are clustered by the rule of maximum similarity with a cluster centre, while the most appropriate cluster centre is also selected dynamically.

Definition 1: the cluster centre (representative or palette colour) of each cluster domain draws on the centre-of-mass calculation in mechanics, with the expression: Ai = Σ_{al ∈ colori} al * N(al) / Σ_{al ∈ colori} N(al), i = 1, 2, ..., K; l = 1, 2, ..., L

Definition 2: in the above equation al ∈ colori, and Ai is the representative colour of the cluster domain colori (i = 1, 2, ..., K); that is, the K best representative colours can be obtained as a palette from a colour image containing many colours, and thus the quantization result is obtained.
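The centre-of-mass rule of Definition 1 amounts to a pixel-count-weighted mean, e.g.:

```python
def cluster_center(colors, counts):
    """Ai: frequency-weighted centroid of the colours al in one cluster
    domain, each colour weighted by its pixel count N(al)."""
    total = sum(counts)
    return tuple(
        sum(c[k] * n for c, n in zip(colors, counts)) / total for k in range(3)
    )
```

A colour appearing three times as often pulls the representative colour three times as strongly toward itself.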

Data structure

Before introducing the data structure, the concepts of cluster features and cluster feature tree are briefly introduced [23].

A clustering feature CF = {N, LS, SS} is a triple where N is the number of points, LS = Σ_{i=1..N} Xi is the linear sum of the N points, which reflects the centre of gravity of the cluster, and SS = Σ_{i=1..N} Xi^2 is the sum of squares of the N points, which reflects the size of the cluster diameter: the smaller the spread it implies, the tighter the cluster. The clustering feature summarises the information of a cluster of points, so a cluster can be represented by its clustering feature rather than by the specific set of points. In this way the clustering feature represents a cluster of data points correctly, efficiently, and adequately while saving a significant amount of space. If a given cluster is formed by merging cluster 1 and cluster 2, then the new cluster has: CF = CF1 + CF2 = {N1 + N2, LS1 + LS2, SS1 + SS2}
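The additivity of clustering features is what makes merging cheap; a minimal sketch:

```python
def make_cf(points):
    """CF = (N, LS, SS) for a list of RGB points."""
    n = len(points)
    ls = tuple(sum(p[k] for p in points) for k in range(3))
    ss = sum(p[k] ** 2 for p in points for k in range(3))
    return (n, ls, ss)

def merge_cf(cf1, cf2):
    """Merging two clusters only adds their CF triples component-wise."""
    n1, ls1, ss1 = cf1
    n2, ls2, ss2 = cf2
    return (n1 + n2, tuple(a + b for a, b in zip(ls1, ls2)), ss1 + ss2)
```

Merging the CFs of two singleton clusters reproduces the CF of their union exactly, so the original points never need to be revisited.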

A clustering feature tree (CF-tree) is a multiway balanced tree, similar to a B+-tree, governed by two parameters: the branching factor B and the cluster diameter threshold T. The branching factor specifies the maximum number of children of each non-leaf node. The diameter threshold limits the diameter (or radius) of a cluster of points; a cluster exceeding it cannot be merged into one class. The size of the tree is a function of T: the smaller the threshold (requiring high similarity among the data items in a class), the larger the tree; conversely, the larger the threshold, the smaller the tree. The size of the CF-tree can therefore be controlled by changing the diameter threshold.
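The size test that governs absorption into a leaf entry can be computed from the CF triple alone, via the standard identity r^2 = SS/N - ||LS/N||^2 (a sketch; the tree bookkeeping itself is omitted):

```python
import math

def cf_radius(cf):
    """Root-mean-square distance of cluster members from the centroid,
    derived purely from CF = (N, LS, SS)."""
    n, ls, ss = cf
    centroid_sq = sum((c / n) ** 2 for c in ls)
    return math.sqrt(max(ss / n - centroid_sq, 0.0))

def fits_threshold(cf, t):
    """A point may be absorbed only while the cluster stays within the
    size threshold T of the CF-tree."""
    return cf_radius(cf) <= t
```

For the two points (0, 0, 0) and (2, 0, 0), CF = (2, (2, 0, 0), 4) and the radius is 1, so the cluster fits any threshold T >= 1.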

The non-leaf nodes of the CF-tree have at most B child entries, of the form: [CFi, Childi], i = 1, 2, ..., B

Childi is a pointer to the i-th child node, and CFi summarises the clustering features of the subtree represented by the i-th child. A leaf node has L child entries, of the form: [CFi], i = 1, 2, ..., L

Ceramic colour matching measurement algorithm
Similarity Metrics and Visual Perception Measures

Minimum colour difference model and palette similarity measurement

To describe the similarity between palettes composed of multiple colour combinations scientifically and effectively, the traditional method of calculating palette colour difference is improved and a minimum colour difference model incorporating positional information is proposed. By introducing an intermediate palette for each pair of palettes, the method avoids differences in calculation results caused by differences in position information. The specific calculation proceeds as follows.

Step 1: assuming the source and target palettes both contain n primary colours, denote the source palette as P1, traverse all its colours, and compute their Lab values; denote the target palette as P2, traverse all its colours, and compute their Lab values.

Step 2: compute the colour difference between the first colour of the source palette P1 and every colour of the target palette P2, record all results and their corresponding colours in P2, and take the colour block with the smallest difference as the first colour of the intermediate palette, denoted P3.

Step 3: repeat the processing of steps 1 and 2 cyclically, computing the colour differences of the remaining n-1 colours of P1 against all colours of P2 one by one, and determine the remaining colours of P3 by the method of step 2, finally obtaining a complete intermediate palette P3.

Step 4: calculate the average colour difference between P1 and P3, defined as: m1 = (Σ_{i=1..n} ΔEi) / n

where

i - the i-th pair of colours of the paired palettes.

n - the total number of palette colours.

ΔEi - the colour difference corresponding to the i th pair of colours.

m1 - the average colour difference between palette P1 and P3.

Step 5: use the n colours of the source palette in turn to compute colour differences against all colours of the target palette, and obtain a new intermediate auxiliary palette P4 by the processing method of step 2.

Step 6: calculate the average colour difference between P2 and P4 using the method of step 4 and record it as m2.

Step 7: the palette similarity evaluation proposed in this section is then defined as: m = (m1 + m2)/2

The calculated colour difference results are normalised; the normalisation and the similarity of a paired palette are defined as: M = (m - m_min)/(m_max - m_min), S = 1 - M

where:

m, m_min, m_max - the minimum colour difference of the current paired palette, and the minimum and maximum colour differences among all paired palettes, respectively.

M, S - the normalised result and the palette similarity measure.
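Steps 1-7 and the normalisation reduce to a few lines; the sketch below assumes palettes are lists of Lab tuples and uses the CIE76 colour difference (a plain Euclidean distance in Lab):

```python
import math

def delta_e(c1, c2):
    """CIE76 colour difference between two Lab colours."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def mean_min_delta_e(src, dst):
    """Steps 1-4: match each source colour to its nearest colour in the
    target palette (the intermediate palette) and average the minima."""
    return sum(min(delta_e(c, d) for d in dst) for c in src) / len(src)

def palette_distance(p1, p2):
    """Step 7: symmetric minimum colour difference m = (m1 + m2) / 2."""
    return (mean_min_delta_e(p1, p2) + mean_min_delta_e(p2, p1)) / 2

def palette_similarity(m, m_min, m_max):
    """Normalisation M and similarity S = 1 - M."""
    return 1 - (m - m_min) / (m_max - m_min)
```

Averaging both directions makes the measure symmetric, so swapping source and target palettes cannot change the result.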

Similarity measure model based on feature fusion

Since the palette similarity measure above and the image-content-based structural similarity measure represent the main colour information and the image content information respectively, and are relatively independent, they are fused at feature level with a variable weight to obtain the similarity result; the fusion is defined as: SIM = [ω*S + (1 - ω)*SSIM] × 100%

where:

ω - the variable weight.

S - Palette similarity measure.

SSIM - Image similarity measure.

SIM - Overall similarity measure.
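The fusion itself is a one-line weighted sum; ω = 0.5 below is an assumed example value, since the text leaves the weight variable:

```python
def overall_similarity(s, ssim, omega=0.5):
    """SIM = [w*S + (1 - w)*SSIM] * 100%, fusing palette similarity S
    with the structural similarity SSIM."""
    return (omega * s + (1 - omega) * ssim) * 100
```

With equal weights, a perfect palette match (S = 1) and an SSIM of 0.5 give an overall similarity of 75%.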

Visual perception measure based on eye-tracking technology [24]

The eye-movement behaviour metrics of each sample were normalised; the normalisation steps are specified in equation (17). The three indicators were then weighted and fused. It is worth noting that, since the first gaze time is negatively correlated with attractiveness in user interaction and evaluation tasks (the shorter the first gaze time, the higher the attractiveness), the normalised first gaze time is subtracted from the normalised theoretical maximum. The weighting formula is shown in equation (20): V = αT + βN + γ(1 - F)

where:

V - visual perception data.

T, N, F - the normalised mean gaze time, number of fixation points, and first gaze time, respectively.

α, β, γ - are weights, all of which are taken as 1/3 in this chapter.

Step 5: calculate the ratio of the visual perception data of the recoloured image to that of its corresponding source image; this ratio is the visual perception measure, shown in equation (21): τ = V′/V

where:

τ - visual perception measure.

V - Visual perception data of the source image.

V′ - Visual perception data of the target image.
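Equations (20)-(21) can be sketched as follows (a minimal illustration; inputs are assumed already normalised to [0, 1]):

```python
def visual_perception(t, n, f, alpha=1/3, beta=1/3, gamma=1/3):
    """V = a*T + b*N + g*(1 - F): the first-gaze term is inverted because
    shorter first-gaze times indicate higher attractiveness."""
    return alpha * t + beta * n + gamma * (1 - f)

def perception_measure(v_source, v_target):
    """tau = V'/V: perception of the recoloured image relative to its source."""
    return v_target / v_source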

Design of colour matching algorithms incorporating visual aesthetics

Visual Aesthetics

In quantifying visual aesthetics with eye-movement behavioural metrics, the three most important indicators are the average gaze time, the average number of fixation points, and the first gaze time, which reflect the visual comfort, visual attractiveness, and visual impact of the test image or video, respectively. Taking the average gaze time as an example, its formula in the eye-tracking experiment is shown in equation (22): T = (Σ_{i=1..n} T(AOI)) / n

Where,

n - number of test images or videos.

T(AOI) - gaze time within the divided area of interest in the eye-movement experiment.

To construct the visual aesthetics data flow from eye-movement behaviour indicators, the three indicators above are first normalised. The average gaze time and the average number of fixation points are positively correlated with visual aesthetic preference; taking the average gaze time as an example, the processing is defined as: h′ = (h - h_min)/(h_max - h_min)

Where,

h - the current data to be processed before normalisation.

h_min, h_max - the minimum and maximum values of the average gaze time, respectively.

The first gaze time is negatively correlated with visual aesthetic preference in the interaction task, and its processing is defined as: j′ = 1 - (j - j_min)/(j_max - j_min)

Where,

j - the current data to be processed before normalisation.

j_min, j_max - the minimum and maximum values of the first gaze time, respectively.

The visual aesthetic parameter W is shown in equation (25): W = (αh′ + βn′ + γj′) × 10, where n′ denotes the normalised average number of fixation points.

where:

W - the visual aesthetic parameter fusing the three different eye-movement behaviour data, taking values in [0,10].

α, β, γ - Weights of the three different eye movement behaviour data.

The three weights are set differently depending on the visual task; this chapter uses α = β = γ = 1/3 to calculate the visual aesthetic score W.
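Putting equations (23)-(25) together (a sketch; each `*_rng` pair gives the observed minimum and maximum used for min-max normalisation):

```python
def normalize(x, lo, hi):
    """Min-max normalisation (h - h_min) / (h_max - h_min)."""
    return (x - lo) / (hi - lo)

def aesthetic_score(h, n, j, h_rng, n_rng, j_rng, w=(1 / 3, 1 / 3, 1 / 3)):
    """W in [0, 10]: gaze time h and fixation count n are normalised
    directly; first-gaze time j is inverted (negative correlation)."""
    h_p = normalize(h, *h_rng)
    n_p = normalize(n, *n_rng)
    j_p = 1 - normalize(j, *j_rng)
    return (w[0] * h_p + w[1] * n_p + w[2] * j_p) * 10
```

A stimulus with maximal gaze time, maximal fixation count, and the shortest first-gaze time scores the full 10 points.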

Image translation model

The image translation model is a conditional generative adversarial network consisting of a generator and a discriminator. The generator's input is a real sample image x and its output is a generated image G(x). Since the discriminator must judge the authenticity of the generator's output, its input is a pair of images consisting of the generated image G(x) and the real image.

In addition, the Pix2Pix network model introduces an L1 loss on top of the cGAN to judge the image globally, as shown in equation (26) [25-26]: G* = arg min_G max_D L_cGAN(G, D) + λ·L_L1(G)

Where,

G, D - represent generator and discriminator respectively.

L_L1(G), λ - the L1 loss term and its weight, respectively.

where the L1 loss is defined as: L_L1(G) = E_{x,y,z}[||y - G(x, z)||_1]

where,

G - generator.

x, y, z - the real sample (input image), the target image, and random noise, respectively.

The loss function of Pix2Pix is shown in equation (28): min_G max_D L(G, D) = E_{x~pdata(x)}[log D(x|y)] + E_{z~pz(z)}[log(1 - D(G(z|y)))] + λ·L_L1(G)

Intelligent colour matching model incorporating visual aesthetics

The loss function of the backbone Pix2Pix network is updated with the aesthetic score of the colour palette. The aesthetic loss function S(G) and the total loss function of the intelligent colour matching algorithm with fused visual aesthetics established in this chapter are shown in Eq. (29) and Eq. (30), respectively: S(G) = 10 - Score; min_G max_D L(G, D) = E_{x~pdata(x)}[log D(x|y)] + E_{z~pz(z)}[log(1 - D(G(z|y)))] + λ·L_L1(G) + γ·S(G)

where:

Score, 10 - denote the score of the visual aesthetic scoring model of the palette and the full score of the aesthetic evaluation, respectively.

γ - the weight of S(G) .
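Numerically, the combined objective of Eqs. (27)-(30) for a single sample looks like the sketch below (an illustration only: `lam` and `gamma_w` are assumed example weights, and the discriminator outputs are probabilities with d_real in (0, 1] and d_fake in [0, 1)):

```python
import math

def l1_loss(y, g_out):
    """L1 loss: mean absolute difference between target y and output G(x, z)."""
    return sum(abs(a - b) for a, b in zip(y, g_out)) / len(y)

def total_loss(d_real, d_fake, l1, score, lam=100.0, gamma_w=1.0):
    """cGAN terms + lam * L1 + gamma_w * S(G), with S(G) = 10 - Score."""
    gan = math.log(d_real) + math.log(1 - d_fake)
    return gan + lam * l1 + gamma_w * (10 - score)
```

A high palette score (Score close to 10) shrinks the aesthetic penalty S(G), steering the generator toward colourings that the scoring model prefers.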

Ceramic colour quantification and matching measurement experiments

This chapter will focus on validating the effectiveness and performance of the ceramic color quantification algorithm and the ceramic color matching measurement algorithm proposed above.

Experiments on the quantification of ceramic colours

In this section, the peak signal-to-noise ratio (PSNR) and the structural similarity index (SSIM) are introduced as evaluation indexes of the quantization effect, i.e., the similarity between the original ceramic image and the reconstructed, quantized image, and the k-means, DBSCAN, and Median Cut algorithms are selected for comparison to validate the performance of the proposed visual-characteristics-based ceramic colour quantization algorithm. Six ceramic images, named Photo1~6, are randomly selected from the database and quantitatively reconstructed by each algorithm, including the ceramic colour quantization algorithm of this paper; the PSNR and SSIM between each reconstructed image and the original are calculated and recorded for comparison. The PSNR and SSIM values of each algorithm are shown in Fig. 1. The larger the PSNR, the smaller the image distortion; the PSNR values of this paper's method are larger than those of the other algorithms on all six ceramic images, at 28.51, 27.17, 25.72, 30.29, 28.25, and 26.49, respectively. The closer the SSIM value is to 1, the better the image quality. Except on Photo2, where the SSIM of this paper's algorithm is 0.81, slightly lower than those of the k-means and Median Cut algorithms (0.91 and 0.89), the SSIM of this paper's method is higher than those of the comparison algorithms on the other ceramic images.
The average SSIM of this paper's ceramic colour quantization algorithm over the six images is about 0.812, the highest of all the algorithms; the averages of k-means, DBSCAN, and Median Cut are 0.8, 0.685, and 0.597, respectively. Overall, the proposed visual-characteristics-based algorithm quantizes ceramic image colour better than the comparison algorithms, with higher structural similarity and colours closer to those of the ceramics themselves.
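For reference, PSNR over flattened pixel arrays can be sketched as follows (SSIM is more involved and omitted here):

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio 10*log10(peak^2 / MSE); larger values
    mean smaller distortion of the quantised image."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / mse)
```

An identical reconstruction gives infinite PSNR, while the worst per-pixel error (255 at every pixel) gives 0 dB.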

Figure 1.

PSNR and SSIM

Ceramic colour matching measurement experiment

Twenty ceramic images are randomly selected from the database, named Photo1~20, and the proposed ceramic colour matching measurement algorithm is applied to evaluate the colour matching of each. The colour matching evaluation scores of this paper's algorithm are compared with expert scores, with the comparison results shown in Figure 2. The trends of the algorithm's scores and the expert scores are approximately the same. Moreover, the largest difference between the algorithm's score and the expert score is 0.94, less than 1, so the difference between the two is small. This demonstrates that the proposed ceramic colour matching evaluation method is effective.

Figure 2.

Color collocation evaluation

Ceramic Colour Design Practice

The preceding sections proposed a ceramic colour quantization algorithm based on visual characteristics for quantifying colour in ceramic art creation, together with a measurement and evaluation method for the colour matching used in ceramic design. This study now applies these methods to actual ceramic art creation and design, producing the ceramic series "Days on the Clouds". The whole set is a 5-piece desktop storage ceramic design consisting of a flower vessel, a storage jar, a pen holder, and two storage trays.

Subjects who met the requirements were screened through interviews and invited to evaluate and score the "Days on the Clouds" series designed in this research and a series of ceramic works A ("work A") not designed with this paper's methodology. The 20 subjects were between 20 and 30 years old and included practitioners and students in the field of design as well as subjects neither practicing nor studying design.

Analysis of eye movement behaviour indicators

This section uses the eye-movement experimental method, tracking subjects' gaze trajectories and data on the "Days on the Clouds" ceramic stimuli designed in this paper as a reflection of the subjects' psychological preference for the works. Eye-movement data were collected with the Tobii Glasses 2 head-mounted eye tracker, which records eye-movement data with high precision and is very light, so it does not interfere with the subjects. The meanings of the different eye-movement indicators are shown in Table 1. The longer the total fixation time and first fixation duration, and the more fixation points there are, the higher the subject's interest.

Table 1. Eye movement data indicators

Number | Eye movement indicator | Meaning
1 | Total fixation time | Total fixation time within the area of interest
2 | First fixation duration | Duration of the first fixation point in the area of interest
3 | Number of fixation points | Amount of attention collected in the area of interest

The subjects' eye-movement behaviour data for the "Days on the Clouds" ceramic series designed in this paper and for work A are shown in Table 2. For the "Days on the Clouds" series, the subjects' average total fixation time and average first fixation duration reached 1686.101 ms and 399.3515 ms, respectively, higher than the 779.954 ms and 256.0915 ms for work A. In terms of fixation points, the subjects' average number of fixation points on the "Days on the Clouds" series was 4.3575, compared with 3.6735 for work A, the former higher by 0.684. Clearly, the "Days on the Clouds" ceramic series designed in this paper is better able to stimulate the subjects' interest.

Table 2. Data of eye movement indicators

Number Work of this article Work A
Total fixation time (ms) Number of fixation points First fixation duration (ms) Total fixation time (ms) Number of fixation points First fixation duration (ms)
1 1788.71 4.78 779.25 995.58 4.81 186.05
2 2266.18 4.55 647.69 765.44 4.66 246.82
3 150.75 1.73 149.67 615.24 2.17 284.13
4 1254.66 4.79 164.99 843.65 3.58 237.1
5 1397.32 6.02 395.21 708.04 4.09 264.03
6 1932.81 6.58 213.89 760.58 3.79 211.98
7 1348.41 3.19 299.82 624.77 2.79 378.94
8 1912.02 8.87 231.9 931.75 3.66 204.35
9 1246.89 3.09 292.69 860.48 3.98 210.64
10 1374.74 1.93 559.66 777.26 2.82 228.22
11 2851.48 7.14 528.77 772.77 3.69 180.02
12 1203.3 3.11 305.49 795.14 4.8 225.46
13 1270.85 2.25 405.02 853.43 3.93 207.99
14 3187.83 5.33 265.54 672.88 3.88 382.72
15 1152.53 2.87 234.37 794.26 2.18 338.91
16 3264.77 3.89 230.06 697.76 3.14 277.19
17 1621.44 4.31 545.37 535.19 3.14 385.27
18 1426.26 2.84 439.47 955.23 4.6 175.44
19 1428.46 4.93 602.44 784.85 4.08 246.21
20 1642.61 4.95 695.73 854.78 3.68 250.36
Average 1686.101 4.3575 399.3515 779.954 3.6735 256.0915
Colour Matching Preferences

After the eye-movement experiment, the subjects were invited to score their colour matching preferences for the "Days on the Clouds" series designed in this paper and for work A as a comparison, on a 10-point scale. The subjects' colour matching preference scores are shown in Figure 3. Subjects No. 3, No. 12, and No. 17 scored work A higher than this paper's ceramic works by 1.07, 0.02, and 0.45 points, while all remaining subjects scored this paper's works significantly higher. The "Days on the Clouds" series obtained an average colour matching preference of 7.17 versus only 5.33 for work A, 1.84 points higher. On the whole, the colour matching of the "Days on the Clouds" series won greater recognition and favour.

Figure 3.

Color matching preferences

Visual Aesthetics Score

The previous section compared the works in terms of colour matching; this section evaluates the overall visual aesthetics of the "Days on the Clouds" series and of work A from an overall perspective, combined with the subjects' eye-movement data analysed above, again on a 10-point scale. The visual aesthetic scores of this paper's ceramic works and of work A are shown in Figure 4. Across all 20 subjects, the scores of this paper's ceramic works are almost always higher than those of work A; only subjects No. 18 and No. 19 rated work A higher, at 6.98 and 6.12 versus 6.02 and 5.06 for this paper's works. The highest visual aesthetic score of this paper's ceramic works is 9.1 and the lowest is 4.9, whereas work A's highest score is below 7 and its lowest is 1.38. On average, this paper's ceramic works score 6.8815 in visual aesthetics while work A scores only 4.7835, lower by 2.098.

Figure 4.

Visual score

Overall, the series of ceramic works “Days on the Clouds,” designed by applying the ceramic color quantification and matching evaluation method proposed in this paper, outperforms work A, which does not use this method, in eye-movement behavioral indexes as well as in color matching preference and visual aesthetics scores. The proposed ceramic color quantification and matching evaluation method can thus provide useful assistance to ceramic art creation and the application of color.

Conclusion

To achieve efficient application of color in ceramic art creation, this paper proposes a color quantization algorithm and a color matching measurement algorithm based on visual characteristics, and conducts experiments to confirm the effectiveness of each. With k-means, DBSCAN, and Median Cut selected for comparison, the PSNR values of this paper’s color quantization algorithm are larger than those of the other algorithms in the color quantization reconstruction of six ceramic images. In SSIM, except on Photo2, this paper’s algorithm scores higher than the comparison algorithms, with an average SSIM of 0.812. The trend of this paper’s color matching algorithm is essentially consistent with the experts’ scoring results, with a maximum difference of 0.94, which is less than 1, confirming the validity of the scoring method.
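PSNR, used above to compare quantization quality, is a standard full-reference image metric. A minimal pure-Python sketch (not the paper's implementation; the toy image is illustrative) of computing PSNR between an original and a quantized image:

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel sequences."""
    if len(original) != len(reconstructed):
        raise ValueError("images must have the same number of pixels")
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return math.inf  # identical images: no quantization error
    return 10.0 * math.log10(peak ** 2 / mse)

# Toy example: 16 grey pixels, one off by 2 after quantization.
img = [128] * 16
quant = img[:]
quant[0] = 130
print(round(psnr(img, quant), 2))  # 54.15
```

Higher PSNR means the quantized image is closer to the original, which is why it serves as the reconstruction-quality criterion in the comparison above.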

Combining this paper’s color quantization algorithm and color matching measurement algorithm in ceramic art creation, a series of ceramic works, “Days on the Clouds,” was designed, and a comparison series, work A, designed without this paper’s method, was used for practical analysis. In the eye movement behavior data, the average gaze duration and average first gaze duration of the ceramic works in this paper are 906.385 ms and 143.0875 ms higher than those of work A, respectively, and the average number of gaze points is 0.753 higher. For color matching, the average color matching preference of the ceramic works in this paper is 7.17, 1.84 points higher than that of work A, and they are more favored by the subjects. In overall visual aesthetics, work A averages 4.7835, and the ceramic works in this paper score 2.098 higher. The series “Days on the Clouds,” created with the ceramic color quantification and matching evaluation method proposed in this paper, shows clear advantages in both color matching and visual aesthetics.
