Affine Transformation Based Ontology Sparse Vector Learning Algorithm
Published online: 06 Apr. 2017
Page range: 111 - 122
Received: 02 Jan. 2017
Accepted: 06 Apr. 2017
DOI: https://doi.org/10.21042/AMNS.2017.1.00009
Keywords
© Linli Zhu, Yu Pan, Jiangtao Wang, published by Sciendo
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.
As a model for information representation and knowledge sharing, ontology has been introduced into nearly all fields of computer science. Acting as a semantic framework for concepts, ontology has proven highly effective and is widely employed in other engineering applications such as biology, medicine, pharmaceutical science, materials science, mechanical engineering and chemistry (for instance, see Coronnello et al. [2], Vishnu et al. [3], Roantree et al. [4], Kim and Park [5], Hinkelmann et al. [6], Pesaranghader et al. [7], Daly et al. [8], Agapito et al. [9], Umadevi et al. [10] and Cohen [11]).
The model of an ontology can be regarded as a graph in which each vertex represents a concept and each edge represents a relationship between two concepts.
Several effective learning tricks have been developed for ontology similarity measuring and ontology mapping. Gao and Zhu [12] studied gradient learning algorithms for ontology similarity computing and ontology mapping. Gao and Xu [13] obtained a stability analysis for ontology learning algorithms. Gao et al. [14] presented an ontology sparse vector learning approach for ontology similarity measuring and ontology mapping based on the ADAL trick. Gao et al. [15] investigated ontology optimization tactics based on distance calculating techniques. More theoretical analysis of ontology learning algorithms can be found in Gao et al. [16].
In this paper, we propose a new ontology learning trick based on affine transformation. Furthermore, we demonstrate the efficiency of the algorithm in biological and chemical applications via experiments.
Let
In recent years, the application of ontology algorithms has faced many challenges. In the fields of chemistry and biology, the situation can become very complex since we need to deal with high dimensional data or big data. Against this background, sparse vector learning algorithms have been introduced into biological and chemical ontology computation (see Afzali et al. [17], Khormuji and Bazrafkan [18], Ciaramella and Borzi [19], Lorincz et al. [20], Saadat et al. [21], Yamamoto et al. [22], Lorintiu et al. [23], Mesnil and Ruzzene [24], Gopi et al. [25], and Dowell and Pinson [26] for more details). For example, suppose we aim to find which genes cause a certain genetic disease: the human genome contains a vast number of genes, so the computation task is complex and demanding. In fact, however, only a few classes of genes cause this kind of genetic disease. A sparse vector learning algorithm can effectively help scientists pinpoint the responsible genes among the mass of candidate genes.
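To illustrate this idea concretely (as a generic sketch of l1-based sparse learning, not the specific algorithm developed in this paper), the following Python example builds synthetic data in which only a handful of the many candidate features carry signal, solves an l1-penalized least squares problem by iterative soft-thresholding, and recovers the small relevant subset. All names, dimensions and parameter values here are illustrative assumptions.

```python
import numpy as np

# Synthetic "gene" data: many candidate features, only a few truly relevant.
rng = np.random.default_rng(0)
n_samples, n_features, n_relevant = 100, 1000, 5
V = rng.standard_normal((n_samples, n_features))
w_true = np.zeros(n_features)
relevant = rng.choice(n_features, n_relevant, replace=False)
w_true[relevant] = rng.uniform(1.0, 3.0, n_relevant)
y = V @ w_true + 0.01 * rng.standard_normal(n_samples)

def ista(V, y, lam, n_iter=2000):
    """Iterative soft-thresholding for min_w 0.5*||V w - y||^2 + lam*||w||_1."""
    w = np.zeros(V.shape[1])
    step = 1.0 / np.linalg.norm(V, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = w - step * V.T @ (V @ w - y)     # gradient step on the squared loss
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft thresholding
    return w

w_hat = ista(V, y, lam=10.0)
print("features selected by the sparse vector:", np.sort(np.flatnonzero(np.abs(w_hat) > 1e-3)))
print("truly relevant features:              ", np.sort(relevant))
```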
One computational method for the ontology function via a sparse vector is expressed by
where
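Since the displayed formula is not reproduced above, we recall the form commonly used in the related ontology sparse vector literature (e.g., Gao et al. [14, 16]); treating it as an assumption about the missing display, each vertex v is encoded as a p-dimensional information vector and the ontology function is the linear score

\[
f_{\mathbf{w}}(v) = \sum_{i=1}^{p} v_i w_i = \mathbf{v}^{T}\mathbf{w},
\]

where w ∈ ℝ^p is the ontology sparse vector to be learned and its few non-zero components single out the information that matters for the application at hand.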
For example, the standard framework with the penalty term given by the
where λ > 0 is a balance parameter and
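As a hedged reconstruction of such a standard framework (assuming an ontology information matrix V ∈ ℝ^{n×p} whose rows are the vertex vectors and a response vector y ∈ ℝ^n; both symbols are ours), the l1-penalized problem reads

\[
\min_{\mathbf{w}\in\mathbb{R}^{p}} \; \frac{1}{2}\|V\mathbf{w}-\mathbf{y}\|_{2}^{2} + \lambda\|\mathbf{w}\|_{1},
\]

where the l1 penalty drives many components of w to exactly zero and λ trades data fidelity against sparsity.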
Let
One general ontology sparse vector learning framework can be stated as
where
An effective method to get the solution is to set variables
By derivation, the Lagrange dual problem of the above ontology problem is formulated as
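The display itself is missing here; under the l1 setting sketched above (and introducing the auxiliary variable z = y − Vw, a notational assumption), minimizing the Lagrangian (1/2)||z||_2^2 + λ||w||_1 + θ^T(y − Vw − z) over z gives z = θ, while the minimum over w is finite only when ||V^Tθ||_∞ ≤ λ, so the Lagrange dual takes the familiar form

\[
\max_{\boldsymbol{\theta}\in\mathbb{R}^{n}} \; \boldsymbol{\theta}^{T}\mathbf{y} - \frac{1}{2}\|\boldsymbol{\theta}\|_{2}^{2}
\qquad \text{subject to} \qquad \|V^{T}\boldsymbol{\theta}\|_{\infty} \le \lambda .
\]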
The other version of ontology framework can be simply expressed as
And, the ontology dual problem of (6) can be written as
for any
for any
Set
Next, we present our dual framework of the ontology problem, which can be formulated as a projection problem. Set
then the ontology problem (5) becomes
Set
Set
Furthermore, we infer
In terms of (10) and (11), we obtain the following equivalent ontology optimization version:
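The displays (10) and (11) are not reproduced here; as a hedged reconstruction, again under the l1 setting assumed above, completing the square in the dual objective turns the equivalent problem into a Euclidean projection:

\[
\boldsymbol{\theta}^{*} \;=\; \operatorname*{arg\,min}_{\boldsymbol{\theta}\in C}\,\|\boldsymbol{\theta}-\mathbf{y}\|_{2}^{2} \;=\; P_{C}(\mathbf{y}),
\qquad
C=\{\boldsymbol{\theta}\in\mathbb{R}^{n} : \|V^{T}\boldsymbol{\theta}\|_{\infty}\le\lambda\},
\]

that is, the dual optimal solution is the projection of y onto the dual feasible set C.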
Now, we discuss the ontology optimization model in more depth in view of affine transformation. Set Θ1 ∈ ℝ
and
In addition, the ontology problem (14) has the same solution as the following ontology optimization problem
Set
It is not hard to check that the dual optimal solution of the ontology problem is the projection of
In what follows, we show the equivalent optimization model of our ontology sparse vector problem. Our discussion is divided into two cases according to whether the value of
If
If
Therefore, our ontology problem (15) can be expressed as
Note that the second term in (18) does not depend on
Let
This implies that the dual optimal solution of the ontology problem is the projection of
The feasible set of ontology problem (21) is stated as
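To make the projection characterization tangible, the following self-contained numpy sketch (our own illustration under the l1 setting assumed earlier, not the paper's exact problem (21)) solves the primal by soft-thresholding, recovers a dual variable from the residual θ = y − Vw, and checks the optimality conditions ||V^Tθ||_∞ ≤ λ and V_j^Tθ = λ·sign(w_j) on the support of w.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, lam = 50, 200, 5.0
V = rng.standard_normal((n, p))
y = V[:, :3] @ np.array([2.0, -1.5, 1.0]) + 0.01 * rng.standard_normal(n)

# Primal: min_w 0.5*||V w - y||^2 + lam*||w||_1, solved by iterative soft-thresholding.
w = np.zeros(p)
step = 1.0 / np.linalg.norm(V, 2) ** 2
for _ in range(20000):
    z = w - step * V.T @ (V @ w - y)
    w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

# Dual variable recovered from the primal residual.
theta = y - V @ w
corr = V.T @ theta

print("dual feasibility: max_j |V_j^T theta| =", round(float(np.max(np.abs(corr))), 4), "with lam =", lam)
support = np.flatnonzero(np.abs(w) > 1e-8)
print("on the support, V_j^T theta should equal lam * sign(w_j):")
print(np.round(corr[support], 4))
print(np.round(lam * np.sign(w[support]), 4))
```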
In this section, we test the feasibility of our new algorithm via four simulation experiments related to ontology similarity measuring and ontology mapping. After obtaining the sparse vector
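The tables below report what appear to be P@N average precision ratios, the criterion used throughout the cited ontology learning experiments. As a hedged sketch of how such a score is typically computed (a toy illustration with hypothetical vertex names, not the actual experimental pipeline), consider:

```python
from typing import Dict, List, Set

def average_p_at_n(predicted: Dict[str, List[str]], expert: Dict[str, Set[str]], n: int) -> float:
    """For every vertex, take the n vertices ranked most similar by the algorithm,
    count how many also appear in the expert-given closest set, and average the
    resulting precision over all vertices."""
    ratios = []
    for v, ranking in predicted.items():
        top_n = ranking[:n]
        hits = sum(1 for u in top_n if u in expert[v])
        ratios.append(hits / n)
    return sum(ratios) / len(ratios)

# Hypothetical toy rankings versus expert-given closest vertices.
predicted = {"v1": ["v2", "v3", "v4"], "v2": ["v1", "v4", "v3"]}
expert = {"v1": {"v2", "v4", "v5"}, "v2": {"v1", "v3", "v6"}}
print(average_p_at_n(predicted, expert, n=3))   # -> 0.666...
```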
In biological science, the “GO” ontology (denoted by
The Structure of “GO” Ontology
Table 1. The experiment results of ontology similarity measure

                                  P@3      P@5      P@10     P@20
Algorithm in our paper            0.4762   0.5504   0.6731   0.7918
Algorithm in Huang et al. [29]    0.4638   0.5348   0.6234   0.7459
Algorithm in Gao and Liang [30]   0.4356   0.4938   0.5647   0.7194
Algorithm in Gao et al. [16]      0.4213   0.5183   0.6019   0.7239
From Fig. 1, take
Physical ontologies
“Physical” ontology
“Physical” ontology
Table 2. The experiment results of ontology mapping

                                  P@1      P@3      P@5
Algorithm in our paper            0.6913   0.7742   0.9161
Algorithm in Huang et al. [29]    0.6129   0.7312   0.7935
Algorithm in Gao and Liang [30]   0.6913   0.7556   0.8452
Algorithm in Gao et al. [31]      0.6774   0.7742   0.8968
It can be seen that our algorithm is more efficient than the ontology learning algorithms proposed in Huang et al. [29], Gao and Liang [30] and Gao et al. [31], in particular when N is sufficiently large.
In this part, the “PO” ontology
Table 3. The experiment results of ontology similarity measure

                                  P@3      P@5      P@10
Algorithm in our paper            0.4865   0.6052   0.7393
Algorithm in Wang et al. [28]     0.4549   0.5117   0.5859
Algorithm in Huang et al. [29]    0.4282   0.4849   0.5632
Algorithm in Gao and Liang [30]   0.4831   0.5635   0.6871
Tab. 3 reveals that the precision ratio obtained by our ontology sparse vector learning algorithm is higher than the precision ratios given by the ontology learning algorithms of Wang et al. [28], Huang et al. [29] and Gao and Liang [30] when
The Structure of “PO” Ontology.
Humanoid robotics ontologies (denoted by
“Humanoid Robotics” ontology
“Humanoid Robotics” ontology
Table 4. The experiment results of ontology mapping

                                  P@1      P@3      P@5
Algorithm in our paper            0.2778   0.5000   0.6556
Algorithm in Gao and Lan [32]     0.2778   0.4815   0.5444
Algorithm in Gao and Liang [30]   0.2222   0.4074   0.4889
Algorithm in Gao et al. [31]      0.2778   0.4630   0.5333
The experiment results presented in Table 4 imply that our ontology sparse vector learning algorithm works more efficiently than the ontology learning algorithms obtained in Gao and Lan [32], Gao and Liang [30] and Gao et al. [31], especially when N is sufficiently large.
In this paper, an affine transformation based computation technique for ontology sparse vector learning is presented. This technique is suitable for ontology similarity measuring and ontology mapping in biological and chemical ontology engineering applications. The main approach rests on affine transformation and its theoretical derivation. Finally, the simulation data show that our ontology scheme achieves high efficiency in the biology, physics, plant and humanoid robotics fields. The ontology sparse vector learning algorithm raised in this paper thus shows promising application prospects in multiple disciplines.
The authors declare that there is no conflict of interests regarding the publication of this paper.