Study of Fuzzy Distance Measure and Its Application to Medical Diagnosis

Ambiguity plays an important part in the contradictory observations we make about the world around us. Entropy, first introduced by Shannon (1948) to measure the degree of randomness in a probability distribution, is essential for quantifying uncertain information. Fuzzy information measures have been applied widely in the area of decision making, and the Jensen–Shannon divergence is a useful distance measure on the space of probability distributions. In the present communication, we propose a way of measuring the difference between two fuzzy sets by means of a function called a divergence, and we study its properties in detail to establish its validity. The newly developed fuzzy divergence measure is then applied to optimal decision making based on the weights of alternatives. A numerical illustration demonstrates the proposed method for solving an optimal decision-making problem in a fuzzy environment.


Introduction
Information theory grew out of mathematical studies of problems connected with the communication, storage, and transmission of messages. It originated from the fundamental paper "The Mathematical Theory of Communication" published by Shannon [1]. Shannon developed mathematical schemes for quantitatively defining the notion of information and proved several celebrated results with far-reaching consequences. Various generalizations of Shannon entropy were studied by Renyi [2], Arimoto [3], Sharma and Taneja [4], De Luca and Termini [5], Kaufmann [6] and Peerzada et al. [7]. Uncertainty and fuzziness are basic features of human thinking and of many real-world objectives. Fuzziness is found in our decisions, in our language and in the way we process information. The fundamental use of information is to remove uncertainty and fuzziness.
In practice, we measure information by the amount of probabilistic uncertainty removed in an experiment; the amount of uncertainty removed is also called a measure of information, while the degree of fuzziness is a measure of the vagueness and ambiguity of uncertainties. The theory of fuzzy sets (FSs) was developed by Zadeh [8], as a generalization of classical set theory, for representing vague and indistinct phenomena. The idea serves as an effective tool for understanding the behaviour of humanistic systems in which human judgment, perceptions and feelings play a critical role. In fuzzy set theory, entropy is defined as a degree of fuzziness, which expresses the amount of ambiguity or difficulty in deciding whether an element belongs to a set or not. Bhandari and Pal [9] extended the probabilistic exponential entropy concept of Pal and Pal [10] to the fuzzy setting. Kapur [11] discussed fuzzy measures of uncertainty due to the fuzziness of information.
In the fuzzy context, several measures have been proposed to quantify the degree of difference between two fuzzy sets; such a measure of difference is called a fuzzy divergence measure.
The similarity measure is an important tool for dealing with uncertainty in decision-making problems through IFS theory. Various distance measures have been proposed by different researchers. It has been observed that different distance measures produce different values when measuring the distance between two IFSs. Moreover, existing distance measures sometimes fail to give an appropriate and convincing result for a pair of IFSs. For this reason, it is necessary to derive improved measures for better decision making.
To explain the distinction between fuzzy sets, the distance measure was set up and regarded as the dual of the similarity measure. Many researchers, such as Yager [12], Kosko [13] and Kaufmann [6], used distance measures to define fuzzy entropy. Several recent methods of generating fuzzy entropy from a distance measure, together with properties of distance measures, were developed by Fan et al. [14]. The distance between two fuzzy subsets, described as a fuzzy subset of R+, was characterized by Dubois and Prade [15]; this generalizes the set of distances between two sets, although it does not generalize the shortest distance between two crisp sets. The shortest distance between two fuzzy sets, described as a density function on the non-negative reals, was given by Rosenfeld [16]. Related to the Kullback and Leibler [17] probabilistic measure of divergence, a measure of fuzzy directed divergence was initiated by Bhandari and Pal [9]. Montes et al. [18] proposed an axiomatic framework for measuring the difference between fuzzy sets and studied in detail the case of local divergences.
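As a point of reference for the probabilistic measure underlying these fuzzy divergences, the Kullback–Leibler directed divergence can be sketched as follows (a minimal illustration, not code from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(P||Q) between two
    probability distributions, with the convention 0 * log(0) = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The divergence is non-negative, vanishes only when P = Q,
# and is directed (not symmetric in its arguments).
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

Its asymmetry is what the word "directed" refers to: D(P||Q) and D(Q||P) generally differ.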
Luo and Zhao [19] gave algorithms for pattern recognition and used them to solve medical diagnosis problems. Gupta and Tiwari [20] and Datta and Goala [21] proposed cosine similarity measures for intuitionistic and interval-valued intuitionistic fuzzy sets using an advanced distance measure on intuitionistic fuzzy sets.

Preliminaries
The model of entropy was initiated to provide a numerical quantification of ambiguity. Shannon [1] defined, for a probability distribution $P = (p_1, p_2, \ldots, p_n)$, the entropy
$$H(P) = -\sum_{i=1}^{n} p_i \log p_i.$$
A fuzzy set $\tilde{A}$ in a finite universe of discourse $X = (x_1, x_2, \ldots, x_n)$ is given by
$$\tilde{A} = \{(x_i, \mu_{\tilde{A}}(x_i)) : x_i \in X\},$$
where $\mu_{\tilde{A}}(x_i) \in [0, 1]$ is the membership grade of $x_i$. Corresponding to Shannon entropy (1948), De Luca and Termini [5] defined the fuzzy entropy of a fuzzy set A as
$$H(A) = -\frac{1}{n}\sum_{i=1}^{n}\left[\mu_A(x_i)\log\mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i))\right].$$
Motivated by the fundamental properties of directed divergence, Kapur [11] explained the concept of fuzzy directed divergence as follows: the directed divergence of fuzzy set A from fuzzy set B is a function D(A; B) that satisfies (i) $D(A;B) \ge 0$ and (ii) $D(A;B) = 0$ if and only if $A = B$. Now, corresponding to the Kullback–Leibler [17] measure of divergence, Bhandari and Pal [9] proposed a fuzzy divergence measure between A and B given by
$$D(A,B) = \sum_{i=1}^{n}\left[\mu_A(x_i)\log\frac{\mu_A(x_i)}{\mu_B(x_i)} + (1-\mu_A(x_i))\log\frac{1-\mu_A(x_i)}{1-\mu_B(x_i)}\right]. \tag{4}$$
Later, Shang and Jiang [22] pointed out that expression (4) has a limitation: if $\mu_B(x_i)$ approaches 0 or 1, its value tends to $\infty$. They therefore proposed a modified version of (4), measured against the average set, given as
$$D^*(A,B) = \sum_{i=1}^{n}\left[\mu_A(x_i)\log\frac{\mu_A(x_i)}{\frac{1}{2}(\mu_A(x_i)+\mu_B(x_i))} + (1-\mu_A(x_i))\log\frac{1-\mu_A(x_i)}{1-\frac{1}{2}(\mu_A(x_i)+\mu_B(x_i))}\right].$$
Corresponding to the Kerridge [23] inaccuracy measure, Verma and Sharma [24] defined a measure of inaccuracy of fuzzy set B with respect to fuzzy set A. Ohlan [25] proposed a parametric generalized measure of divergence between two fuzzy sets A and B corresponding to Taneja [26], with parameter $t = 0, 1, 2, \ldots$. A generalized measure of fuzzy directed divergence of order $\alpha$ and type $\beta$ was given by Arora and Dhiman [27], where $\alpha > 0$, $\alpha \ne 1$, $\beta \ne 0$.
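The classical measures recalled above can be sketched in code. The following is an illustrative implementation of the De Luca–Termini fuzzy entropy, the Bhandari–Pal divergence (4), and the Shang–Jiang modification, operating on lists of membership grades; it is not the paper's proposed measure:

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy set given by its
    membership grades mu, with the convention 0 * log(0) = 0."""
    h = 0.0
    for m in mu:
        for p in (m, 1.0 - m):
            if p > 0.0:
                h -= p * math.log(p)
    return h / len(mu)

def bhandari_pal_divergence(mu_a, mu_b):
    """Bhandari-Pal fuzzy directed divergence D(A, B); note that the
    value blows up as a grade of B approaches 0 or 1."""
    d = 0.0
    for a, b in zip(mu_a, mu_b):
        if a > 0 and b > 0:
            d += a * math.log(a / b)
        if a < 1 and b < 1:
            d += (1 - a) * math.log((1 - a) / (1 - b))
    return d

def shang_jiang_divergence(mu_a, mu_b):
    """Shang-Jiang modification: A is compared against the average
    set (A + B) / 2, so the value stays finite."""
    d = 0.0
    for a, b in zip(mu_a, mu_b):
        m = (a + b) / 2.0
        if a > 0:
            d += a * math.log(a / m)
        if a < 1:
            d += (1 - a) * math.log((1 - a) / (1 - m))
    return d
```

A crisp set has zero fuzzy entropy, while grades of 0.5 maximize it; both divergences vanish when the two fuzzy sets coincide.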
Prakash and Kumar [28] proposed a new fuzzy divergence measure of fuzzy set B with respect to fuzzy set A. Kumari et al. [29] proposed a weighted fuzzy exponential J-divergence, where the membership values are those of the pixels of an image and the argument ranges over the (i, j)th pixels of image A. Tiwari and Gupta [30] proposed entropy measures and derived relations between distance, entropy, and similarity measures for IvIFSs.
The Proposed Fuzzy Distance Measure
Theorem 3.1. The fuzzy distance measure D(A; B) defined in equation (11) is a valid measure of fuzzy divergence.
Proof. The new distance measure satisfies all four conditions required of a distance measure. From (12) and (13), the result follows.
The data in Tables 1 and 2 serve the proposed computational application. In view of Table 3, each patient is diagnosed with the disease for which the computed distance value is smallest, since a smaller value of the distance measure for a patient against a disease indicates a higher probability of the patient having that disease.
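The diagnosis step can be sketched as follows. This is a minimal illustration of the minimum-distance rule only: the normalized Hamming distance stands in for the paper's measure (11), and the symptom names and membership grades are hypothetical, not the values of Tables 1 and 2:

```python
def hamming_distance(mu_a, mu_b):
    """Normalized Hamming distance between two fuzzy sets
    (a stand-in for the paper's proposed measure (11))."""
    return sum(abs(a - b) for a, b in zip(mu_a, mu_b)) / len(mu_a)

# Hypothetical patient-symptom and disease-symptom membership grades
# over the symptoms (temperature, headache, cough, stomach pain).
patients = {"P1": [0.8, 0.6, 0.2, 0.1],
            "P2": [0.3, 0.2, 0.1, 0.9]}
diseases = {"viral fever":     [0.9, 0.7, 0.3, 0.1],
            "stomach problem": [0.2, 0.1, 0.1, 0.8]}

def diagnose(patient_mu, distance=hamming_distance):
    # Assign the disease whose symptom profile is at the smallest
    # distance from the patient's symptom profile.
    return min(diseases, key=lambda d: distance(patient_mu, diseases[d]))

for name, mu in patients.items():
    print(name, "->", diagnose(mu))
```

Any valid fuzzy distance measure can be passed in place of `hamming_distance`; the decision rule itself is unchanged.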

Comparative Study
Jain and Kumar [31] proposed an intuitionistic-fuzzy-based trigonometric entropy, whose fuzzy version is used here. From the table, it is concluded that the largest value in the column gives the decision value.
Wei et al. [32] proposed a generalized fuzzy entropy. From the table, it is concluded that the smallest value in the column gives the decision value.
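The two comparative decision rules differ only in their direction: entropy-type measures select the alternative with the largest value, while distance- or divergence-type measures select the smallest. A sketch with hypothetical decision values (not those of the paper's tables):

```python
# Hypothetical decision values for three alternatives A1-A3.
entropy_scores  = {"A1": 0.62, "A2": 0.81, "A3": 0.55}  # entropy-type: larger is better
distance_scores = {"A1": 0.34, "A2": 0.12, "A3": 0.47}  # distance-type: smaller is better

# argmax for the entropy-type rule, argmin for the distance-type rule.
best_by_entropy = max(entropy_scores, key=entropy_scores.get)
best_by_distance = min(distance_scores, key=distance_scores.get)
print(best_by_entropy, best_by_distance)
```

When both rules select the same alternative, as in this sketch, the measures corroborate each other.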

Conclusions
In this paper, we have proposed a relative distance measure for fuzzy sets and established its validity through numerical computation. Some essential properties of the measure are also studied. It has been observed that this measure is more flexible than previously derived measures. An application of the measure to medical diagnosis is studied to check its legitimacy. Moreover, from Tables 4 and 5, it is concluded that the results obtained from the proposed measure agree with those of the existing entropies, which validates that the proposed measure is valid and has applications across disciplines.