**Computer aided diagnosis of alzheimers disease health and social care essay**

- Published: September 7, 2022
- Language: English

Based on NMF-Decision Tree Classifier

Nancy Noella R. S, Department of Computer Science and Engineering, Marthandam College of Engineering and Technology, Anna University, Chennai – 97, Tamil Nadu, India

## Abstract

This paper presents a novel Computer-Aided Diagnosis (CAD) technique for the early diagnosis of Alzheimer's disease (AD), based on a Nonnegative Matrix Factorization (NMF)-based Decision Tree Classifier (DTC). In general, decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances: each path from the tree root to a leaf corresponds to a conjunction of attribute tests, and the tree itself to a disjunction of these conjunctions. The CAD tool is designed for the study and classification of functional brain images. For this purpose, two different brain image databases are selected: a single photon emission computed tomography (SPECT) database and positron emission tomography (PET) images, both containing data for Alzheimer's disease patients and for healthy controls as a reference. These databases are analyzed by applying Nonnegative Matrix Factorization and the wavelet transform (WT) for selection and extraction of the most relevant features. The resulting wavelet-transformed data sets, which contain a reduced number of features, are classified by means of a decision tree classifier with bounds of confidence for decision.

## Index Terms

Alzheimer's Disease (AD), Nonnegative Matrix Factorization (NMF), Positron Emission Tomography (PET), Single Photon Emission Computed Tomography (SPECT), Decision Tree Classifier (DTC), Wavelet Transform (WT)

## I. INTRODUCTION

ALZHEIMER'S disease (AD) is the most common cause of dementia in elderly people and affects more than 30 million individuals worldwide. The particular evolution of AD patients and their increasing dependence on their close affective environment has an important social repercussion, as the cognitive functions of the patient gradually disappear and their individual essence blurs [1]. The effects of this disease are of great importance not only in terms of family dependence and affliction but also economically: with the growth of the older population in developed nations, the prevalence of AD is expected to triple over the next 50 years. Functional imaging modalities are often used with the aim of achieving early diagnosis, although early diagnosis remains a demanding task, usually based on the information provided by a careful clinical examination carried out by experts.

Emission computed tomography images have been widely employed in biomedical research and clinical medicine during the last decade. These emission-based functional images reproduce a map of physiological functions along with anatomical structure images, providing information about physiological phenomena and their location in the body. In this work, two different modalities are used for brain image acquisition: positron emission tomography (PET) and single photon emission computed tomography (SPECT). Both are noninvasive nuclear medicine imaging techniques that produce a three-dimensional image of functional processes in the body, such as blood perfusion or glucose metabolism, by means of emitted radionuclides (tracers) [2], [3]. In both PET and SPECT, the detected emissions are processed and a three-dimensional image of the region under study, in this case the brain, is obtained by means of subsequent computer analysis and Fourier back-projection algorithms [2].

## A. Computer Aided Diagnosis (CAD) Techniques

For the past several decades, researchers in the medical imaging field have focused on bringing new imaging modalities to clinicians while improving the performance of existing systems [4], [5]. Nowadays, signal processing engineers are beginning to take the next step by introducing software improvements, enabling computers to help clinicians make sense of this amount of noninvasive medical information. Computer-aided diagnosis (CAD) is a general term for a variety of techniques applied to many kinds of medical data, such as medical images, to assist physicians in their diagnostic work. CAD systems help physicians either by identifying patterns that might have been overlooked or by providing a road map of suspicious areas, making their efforts more efficient.

Several approaches for designing CAD systems for Alzheimer's disease diagnosis can be found in the literature [6]–[9]. The first approach is the statistical parametric mapping (SPM) tool [10], widely used in neuroscience, and its numerous variants. It was not developed to study a single image, but to compare groups of images. SPM is designed as a univariate approach, since classical multivariate techniques require the number of available observations (i.e., images) to be greater than the number of components (i.e., voxels) of the multivariate observation. The second approach is based on the analysis of the images themselves: the analysis of regions of interest (ROI), feature extraction, and posterior classification into different classes by means of discriminative functions. This multivariate approach faces the well-known small sample size problem, that is, the number of available samples is much lower than the number of features used in the training step. In this case, dimensionality reduction and feature selection are issues of great concern.
As noted in [11] and [12], statistical learning classification methods and CAD tools have not been explored in depth for dementia, quite possibly because images represent large amounts of data while most imaging studies have relatively few subjects (typically ) [11], [13], [14]. A supervised learning-based CAD system applied to functional imaging consists of several important stages: 1) functional image acquisition and normalization; 2) feature selection and extraction; and 3) classification, with a train and test strategy. The design and proper validation of a CAD tool for Alzheimer's disease diagnosis, along with an adequate description of its constituent techniques for feature selection, extraction, and posterior classification, is the main issue of this work. Sections II and III describe the main techniques applied in the CAD tool: the first is devoted to the reduction and rearrangement of the data by means of nonnegative matrix factorization (NMF), while the second is devoted to the proper definition of a Decision Tree Classifier (DTC) with bounds of confidence. Finally, conclusions are drawn in Section IV.

## II. FEATURE SELECTION AND REDUCTION

Each voxel of a functional 3-D brain image contains information about the corresponding brain point. However, not all voxels have the same level of relevance in terms of discrimination between groups of subjects. In this case, two groups of subjects are defined: Alzheimer's disease patients, labeled AD, and subjects not affected by this disease, labeled NOR. Thus, an initial feature selection based on discrimination capability is typically applied [13]–[14], obtaining a vector of discriminant voxels for each participant. In addition, the selected discriminant voxel vectors can be projected onto a different subspace. This subspace is chosen so that only a few variables represent the most discriminant features of each patient's images in each database. These steps are conveniently described below.
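As an illustration of a discrimination-based voxel selection, the sketch below ranks voxels by a two-sample t-statistic between the AD and NOR groups and keeps the top-ranked ones. The function name and the use of a t-statistic as the discrimination measure are assumptions made here for illustration, not the exact criterion of the original work.

```python
import numpy as np

def select_discriminant_voxels(ad_images, nor_images, n_keep=1000):
    """Rank voxels by a two-sample (Welch) t-statistic between the AD
    and NOR groups and return the indices of the most discriminant ones.

    ad_images, nor_images: arrays of shape (n_subjects, n_voxels),
    each row a flattened, intensity-normalized brain image.
    """
    mean_ad, mean_nor = ad_images.mean(axis=0), nor_images.mean(axis=0)
    var_ad = ad_images.var(axis=0, ddof=1)
    var_nor = nor_images.var(axis=0, ddof=1)
    n_ad, n_nor = ad_images.shape[0], nor_images.shape[0]
    # Absolute t-statistic per voxel; epsilon guards against zero variance.
    t = np.abs(mean_ad - mean_nor) / np.sqrt(var_ad / n_ad + var_nor / n_nor + 1e-12)
    return np.argsort(t)[::-1][:n_keep]  # indices of the n_keep top voxels
```

Each subject is then represented by the vector of intensities at the selected voxel indices, which is the input to the subsequent projection step.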

## A. Intensity Normalization

Prior to any feature selection, the data sets have to be normalized in intensity so that images can be compared according to their normalized voxel intensity levels. Normalizing to the maximum intensity level may cause problems in images that have peak intensity values due to noise: such images are badly normalized, as the normalization is based on wrong, noisy voxels. In this work, in order to avoid these normalization errors, intensity normalization based on the mean value of a group of voxels with the highest intensity values is applied. According to [15], the mean value of the 0.1% of voxels with the highest intensity levels is selected for the intensity normalization.
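The scheme described above can be sketched as follows. This is a minimal illustration; `normalize_intensity` and its `top_fraction` parameter are hypothetical names introduced here, not identifiers from the original work.

```python
import numpy as np

def normalize_intensity(image, top_fraction=0.001):
    """Divide an image by the mean of its brightest voxels.

    The reference level is the mean of the top 0.1% highest-intensity
    voxels rather than the single maximum, so that one noisy peak voxel
    cannot skew the normalization.
    """
    flat = image.ravel()
    n_top = max(1, int(round(top_fraction * flat.size)))
    top_mean = np.sort(flat)[-n_top:].mean()  # mean of the brightest voxels
    return image / top_mean
```

After this step, the brightest tissue in every image sits near intensity 1, making voxel values comparable across subjects.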

## B. Wavelet Transform for Feature Selection

A wavelet series is a representation of a square-integrable (real- or complex-valued) function by an orthonormal series generated by a wavelet. The integral wavelet transform is the integral transform defined as

$$\left[W_\psi f\right](a, b) = \frac{1}{\sqrt{|a|}} \int_{-\infty}^{\infty} \overline{\psi\!\left(\frac{x-b}{a}\right)}\, f(x)\, dx$$

The wavelet coefficients $c_{jk}$ are then given by

$$c_{jk} = \left[W_\psi f\right]\left(2^{-j},\, k\,2^{-j}\right)$$

Here, $a = 2^{-j}$ is called the binary or dyadic dilation, and $b = k\,2^{-j}$ is the binary or dyadic position.
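For illustration, the dyadic decomposition above can be computed with a minimal one-dimensional Haar transform. This is a sketch under stated assumptions, not the paper's implementation: a real CAD pipeline would typically use a wavelet library (e.g., PyWavelets) and richer wavelet families.

```python
import numpy as np

def haar_dwt(signal, levels=1):
    """One-dimensional Haar wavelet transform.

    Each level splits the signal into approximation (low-pass) and
    detail (high-pass) coefficients, halving the length -- the dyadic
    a = 2^{-j} sampling of the integral transform above.
    Returns [detail_1, ..., detail_L, approx_L].
    """
    coeffs = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        evens, odds = approx[0::2], approx[1::2]
        detail = (evens - odds) / np.sqrt(2)   # high-pass: local differences
        approx = (evens + odds) / np.sqrt(2)   # low-pass: local averages
        coeffs.append(detail)
    coeffs.append(approx)
    return coeffs
```

Because the Haar basis is orthonormal, the transform preserves signal energy, and discarding small detail coefficients yields a compact feature vector.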

## C. Nonnegative Matrix Factorization for Feature Reduction

Nonnegative matrix factorization (NMF) is a technique for finding parts-based, linear representations of nonnegative data [17], [18], and a useful decomposition tool for multivariate data. It is especially suitable for nonnegative data sets such as functional images in general, and for the PET and SPECT brain images of this work in particular, where all variables consist of positive values. Given a nonnegative data matrix A, NMF finds an approximate factorization A ≈ WH into nonnegative matrices W and H. The nonnegativity condition forces the representation to be purely additive, in contrast to other representations such as principal component analysis (PCA), kernel PCA, etc. Both PCA and NMF can be seen as matrix factorizations with different choices of objective function and/or constraints. NMF has been widely applied in the field of image processing: a variety of works based on NMF can be found in the literature, for image processing in general (e.g., face recognition) and for medical image processing and analysis in particular (brain image analysis, dynamic myocardial image analysis, etc.), as well as works on NMF algorithms [17] and on kernel techniques applied to NMF.

Formally, nonnegative matrix factorization is a linear, nonnegative approximate data representation in which the original database A = [A1, A2, …, AM] (N × M elements), consisting of M measurements (profiles) of N nonnegative scalar variables, is approximated by a nonnegative matrix product, as given in

$$A \approx WH \qquad (2)$$

where the matrix W = [W1, W2, …, WK] has dimension N × K and the matrix H = [H1, H2, …, HM] has dimension K × M. Thus, each element of A is decomposed as shown in

$$A_{nm} \approx \sum_{k=1}^{K} W_{nk} H_{km} \qquad (3)$$

An appropriate decision on the value of K is critical in practice, but the choice of K is very often problem dependent.
In most cases, however, K is chosen such that K << min(M, N), in which case WH can be seen as a compressed form of the data in A. This property yields a reduced-variable matrix H that represents A in terms of the NMF basis W. After NMF factorization, the data contained in H (K × M elements) can be considered a transformed database of lower rank (K) than the original database A. Thus, a few variables represent the data of each profile in the new representation. The relative error (%) of the factorization can be computed by comparing the matrix A and the approximation WH. The minimum number of vectors K in the NMF basis is selected so that a predefined level of relative error is not exceeded.

1) Factorization Rule: Given the data matrix A, the optimal choice of matrices W and H is defined as the pair of nonnegative matrices that minimize the reconstruction error between A and WH. A variety of error functions (Err) have been proposed [17], [18]; two of the most useful, applied in this work [17], are given in (4) and (5):

$$\mathrm{Err}_F(W, H) = \|A - WH\|_F^2 = \sum_{n,m} \left(A_{nm} - (WH)_{nm}\right)^2 \qquad (4)$$

$$\mathrm{Err}_{KL}(W, H) = \sum_{n,m} \left(A_{nm} \log \frac{A_{nm}}{(WH)_{nm}} - A_{nm} + (WH)_{nm}\right) \qquad (5)$$

where (4) is known as the Frobenius norm (reduction of the Euclidean distance) and (5) as the Kullback–Leibler divergence, among others. The NMF process is thus translated into an optimization problem, subject to minimization of the chosen Err. Some NMF algorithms are proposed in [16]. There are different approaches for these algorithms [17]: with a multiplicative update rule, with an additive update rule, or with alternating least squares (ALS). Due to its fast convergence and lower iteration requirements, the last one is selected for NMF in this work. Although there are a variety of nonlinear techniques for feature reduction, in this work only the linear case is considered and NMF is selected, due to the simplicity of the proposed factorization and the preservation of a linear relation between the original space of features and the new one.

Fig. 1. NMF projection for the same transaxial slice of all the SPECT database subjects. (a) NMF eigenvectors for k = 1, 2, 3. (b) One example subject slice. (c) Its NMF reconstruction. Note the special relevance of brain regions in the eigenvectors.

For the sake of clarity, Fig. 1 provides the first three vectors of the NMF basis W, in the form of 2-D images, derived from one of the data sets used in this work (the SPECT database), along with one particular transaxial slice and the one obtained from its NMF projection. The transaxial slices in Fig. 1 are oriented from posterior (top) to anterior (bottom); the same orientation criterion is followed in the rest of the document. For all the data sets used in this work, a relative error lower than 5% is guaranteed in all the NMF transformations.
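The ALS variant named above can be sketched as follows, with negative entries clipped to zero after each unconstrained least-squares solve. This projected-ALS scheme is a common simple choice and an assumption made here for illustration; it is not necessarily the exact algorithm of [17].

```python
import numpy as np

def nmf_als(A, k, n_iter=200, seed=0):
    """Nonnegative matrix factorization A ≈ WH by alternating least
    squares: alternately solve for H and W in the least-squares sense,
    clipping negative entries to zero to enforce nonnegativity."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(n_iter):
        # Fix W, solve min ||A - WH||_F for H, then project onto H >= 0.
        H = np.linalg.lstsq(W, A, rcond=None)[0].clip(min=0)
        # Fix H, solve the transposed problem for W, then project.
        W = np.linalg.lstsq(H.T, A.T, rcond=None)[0].T.clip(min=0)
    rel_err = np.linalg.norm(A - W @ H) / np.linalg.norm(A)
    return W, H, rel_err
```

The returned relative error plays the role of the factorization error discussed above: K is grown until this error falls below the predefined threshold (5% in this work).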

## III. DECISION TREE CLASSIFIER (DTC) WITH BOUNDS OF CONFIDENCE

In a DTC, denote the feature space by X. The input vector x ∈ X contains p features X1, X2, …, Xp, some of which may be categorical. Tree-structured classifiers are constructed by repeated splits of subsets of X into two descendant subsets, beginning with X itself. The union of the regions occupied by two child nodes is the region occupied by their parent node. Every leaf node is assigned a class, and a query is associated with the class of the leaf node it lands in. A node is denoted by t, its left child node by tL and its right child node by tR. The collection of all nodes is denoted by T and the collection of all leaf nodes by ˜T. A split is denoted by s and the set of splits by S. Fig. 2 represents the splitting of the tree according to these values.

The construction of a tree involves the following three elements:

1. The selection of the splits.
2. The decision when to declare a node terminal or to continue splitting it.
3. The assignment of each terminal node to a class.

An impurity function is a function ϕ defined on the set of all K-tuples of numbers (p1, …, pK) satisfying pj ≥ 0, j = 1, …, K, ∑j pj = 1, with the properties:

1. ϕ achieves its maximum only at the point (1/K, 1/K, …, 1/K).
2. ϕ achieves its minimum only at the points (1, 0, …, 0), (0, 1, 0, …, 0), …, (0, 0, …, 0, 1).
3. ϕ is a symmetric function of p1, …, pK, i.e., ϕ remains constant under any permutation of the pj.

Given an impurity function ϕ, define the impurity measure i(t) of a node t as i(t) = ϕ(p(1 | t), p(2 | t), …, p(K | t)), where p(j | t) is the estimated probability of class j within node t. The goodness of a split s for node t, denoted by ϕ(s, t), is defined by ϕ(s, t) = ∆i(s, t) = i(t) − pR i(tR) − pL i(tL), where pL and pR are the proportions of the samples in node t that go to the left node tL and the right node tR, respectively. Define l(t) = i(t) p(t), that is, the impurity of node t weighted by the estimated proportion of data that reach node t.
The impurity l(T) of a tree T is defined by

$$l(T) = \sum_{t \in \tilde{T}} l(t) = \sum_{t \in \tilde{T}} i(t)\, p(t) \qquad (6)$$

Note that for any node t the following equations hold:

$$p(t_L) + p(t_R) = p(t), \quad p_L = p(t_L)/p(t), \quad p_R = p(t_R)/p(t), \quad p_L + p_R = 1 \qquad (7)$$

Define

$$\Delta l(s, t) = l(t) - l(t_L) - l(t_R) = p(t)\, \Delta i(s, t) \qquad (8)$$

Other strategies based on the combination of classifiers may be considered, but the one provided above is taken as the reference in this work.
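The impurity and goodness-of-split definitions above can be made concrete with a short sketch. Entropy is used here as the impurity function ϕ, which is one admissible choice satisfying the three properties listed; the excerpt does not fix a particular ϕ, so this is an illustrative assumption.

```python
import numpy as np

def entropy_impurity(p):
    """Impurity function ϕ: entropy of a class-probability tuple.
    Maximal at the uniform distribution, zero at any pure node."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0·log(0) is taken as 0
    return -np.sum(p * np.log2(p))

def goodness_of_split(labels, go_left, n_classes):
    """Δi(s, t) = i(t) − pL·i(tL) − pR·i(tR) for a candidate split s,
    where go_left is a boolean mask sending each sample of node t to tL."""
    def node_impurity(y):
        counts = np.bincount(y, minlength=n_classes)
        return entropy_impurity(counts / counts.sum())
    left, right = labels[go_left], labels[~go_left]
    p_left = len(left) / len(labels)
    return (node_impurity(labels)
            - p_left * node_impurity(left)
            - (1 - p_left) * node_impurity(right))
```

At each node, the split s maximizing this goodness is chosen, which is exactly the selection rule expressed by ϕ(s, t) = Δi(s, t) above.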

## IV. CONCLUSION

This project presents a novel computer-aided diagnosis (CAD) technique for the early diagnosis of Alzheimer's disease (AD), based on a nonnegative matrix factorization (NMF)-based decision tree classifier. In general, decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances: each path from the tree root to a leaf corresponds to a conjunction of attribute tests, and the tree itself to a disjunction of these conjunctions. The CAD tool is designed for the study and classification of functional brain images. For this purpose, two different brain image databases are selected: a single photon emission computed tomography (SPECT) database and positron emission tomography (PET) images, both containing data for Alzheimer's disease (AD) patients and for healthy controls as a reference. These databases are analyzed by applying nonnegative matrix factorization (NMF) and the wavelet transform (WT) for selection and extraction of the most relevant features. The resulting wavelet-transformed data sets, which contain a reduced number of features, are classified by means of a decision tree classifier with bounds of confidence for decision.
