SOFT COMPUTING APPLICATION


Preface
The digital revolution and the explosive growth of the Internet have helped create collections of huge amounts of useful data with diverse characteristics. Data is a valuable, intangible asset in any business today. Databases and database technologies play a crucial role in maintaining and manipulating data. Various database models exist, such as relational, hierarchical, network, flat, and object-oriented. Each model organizes data in a different way to make it suitable for the intended application.
Real-world data are diverse and imprecise in nature, and they are growing at a phenomenal rate. Because application needs are so varied, they place very different requirements on the underlying models. The conventional relational database model is no longer adequate for heterogeneous data. The diverse characteristics of data and its huge volume demand new ways of carrying out data analysis. Soft computing is an emerging discipline that complements traditional computing principles. It exploits the tolerance for imprecision and uncertainty to achieve solutions to complex problems.
Soft computing methodologies include fuzzy sets, neural networks, genetic algorithms, Bayesian belief networks, and rough sets. Fuzzy sets provide a natural framework for dealing with uncertainty. Bayesian belief networks, neural networks, and rough sets are widely used for classification and rule generation. Genetic algorithms handle optimization and search processes such as query optimization and template selection. Rough sets handle uncertainty arising from granularity in the domain of discourse.
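As a rough illustration of the fuzzy-set idea that recurs throughout the book, the short sketch below defines a membership function for a hypothetical linguistic term "young"; the term and its breakpoints are assumptions chosen purely for illustration and are not taken from any chapter.

```python
# A minimal sketch of the fuzzy-set idea: membership is a degree in [0, 1]
# rather than a crisp yes/no. The term "young" and its breakpoints below
# are illustrative assumptions, not definitions from the book.

def young(age: float) -> float:
    """Degree to which an age is considered 'young' (trapezoidal membership)."""
    if age <= 25:
        return 1.0
    if age >= 40:
        return 0.0
    return (40 - age) / 15  # linear fall-off between 25 and 40

if __name__ == "__main__":
    for a in (20, 30, 38, 45):
        print(f"age={a:2d}  membership in 'young' = {young(a):.2f}")
```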
The advent of soft computing marks a significant paradigm shift in computing, and it currently has a wide range of applications. The various techniques of soft computing and their applications in database technologies are discussed in the chapters of this book.
The first chapter, titled “Fuzzy Database Modeling: An Overview and New Definitions,” by Angélica Urrutia and José Galindo, gives an overview of fuzzy database modeling. Much of the data in the real world is not precise but fuzzy. Zadeh’s fuzzy logic provides a tool for handling fuzzy data in decision making, and the modeling of fuzzy data has been studied by a number of researchers. The authors of this chapter discuss further extensions in the field of fuzzy database modeling. The chapter starts with a review of previous contributions to fuzzy database modeling, particularly the FuzzyEER model, and then introduces and explains the authors’ newly proposed definitions for the FuzzyEER model concerning fuzzy attributes, fuzzy degrees, and fuzzy entities. The newly introduced concepts are amply illustrated through suitable examples. The authors hope that the new definitions will further enhance the FuzzyEER model to facilitate fuzzy queries and fuzzy data mining.
In the second chapter, “A quick presentation of Evolutionary Computation,” the author Pierre Collet gives an easy-to-grasp exposition of the generic evolutionary computation paradigm. After a brief historical perspective, the author presents a unified evolutionary algorithm, and the various concepts involved are lucidly explained with examples. This chapter serves as a gentle introduction to, and survey of, generic evolutionary computation.
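To make the generic paradigm concrete, the following minimal sketch shows the standard evolutionary loop (initialize, evaluate, select, vary, replace) on a toy bit-string problem; the problem, operators, and parameter values are illustrative assumptions, not Collet’s unified algorithm itself.

```python
import random

# Minimal sketch of a generic evolutionary algorithm on a toy problem:
# maximise the number of 1s in a fixed-length bit string ("one-max").
# Population size, mutation rate, and generation count are illustrative
# assumptions, not values prescribed by the chapter.

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 40, 50, 0.02

def fitness(genome):
    return sum(genome)  # count of 1-bits

def tournament(pop):
    a, b = random.sample(pop, 2)              # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randint(1, GENOME_LEN - 1)   # one-point crossover
    return p1[:cut] + p2[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]
best = max(population, key=fitness)
print("best fitness:", fitness(best), "out of", GENOME_LEN)
```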
The third chapter, “Evolutionary Algorithms in Supervision of Error-free Control,” by Bohumil Sulc and David Klimanek, reports the application of certain soft computing techniques in combustion control. Specifically, genetic and simulated annealing algorithms are employed for model-based detection of controlled-variable sensor discredibility. The authors outline the procedure for incorporating genetic and simulated annealing algorithms into the control loop. They argue that this application of soft computing techniques is of great importance in industrial practice, because timely prediction of a sensor malfunction helps avoid the additional costs resulting from unplanned shutdowns.
In the next chapter, titled “Soft Computing Techniques in Spatial Databases,” Markus Schneider explains how two soft computing techniques with different expressiveness can be used for spatial data handling in the context of spatial databases and Geographic Information Systems. The focus of the chapter is the design of two algebra systems, the Vague Spatial Algebra (VASA) and the Fuzzy Spatial Algebra (FUSA). A formal definition of the structure and semantics of both types of systems is provided, and spatial set operations for both algebras are discussed. Finally, the chapter describes how these data types can be embedded into extensible databases, illustrated with sample queries.
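The following small sketch illustrates, in very simplified form, the vague-region idea behind VASA, in which a vague spatial object pairs a kernel part (definitely belonging) with a conjecture part (possibly belonging); representing the parts as sets of grid cells and the particular operations shown are illustrative assumptions rather than the chapter’s formal definitions.

```python
# A minimal sketch of the vague-region idea: a vague region is modelled as
# a pair (kernel, conjecture), where the kernel definitely belongs to the
# region and the conjecture possibly belongs. Using sets of grid cells is
# an illustrative simplification; VASA is defined over crisp complex regions.

class VagueRegion:
    def __init__(self, kernel, conjecture):
        self.kernel = set(kernel)
        self.conjecture = set(conjecture) - set(kernel)  # keep the parts disjoint

    def union(self, other):
        kernel = self.kernel | other.kernel
        conjecture = (self.conjecture | other.conjecture) - kernel
        return VagueRegion(kernel, conjecture)

    def intersection(self, other):
        kernel = self.kernel & other.kernel
        maybe = (self.kernel | self.conjecture) & (other.kernel | other.conjecture)
        return VagueRegion(kernel, maybe - kernel)

lake = VagueRegion(kernel={(0, 0), (0, 1)}, conjecture={(0, 2)})   # flood zone uncertain
marsh = VagueRegion(kernel={(0, 1), (1, 1)}, conjecture={(0, 2), (1, 2)})
both = lake.intersection(marsh)
print("surely wet:", both.kernel, " possibly wet:", both.conjecture)
```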
In the fifth chapter, “Type-2 Fuzzy Interface for Artificial Neural Network,” the author Priti Srinivas Sajja introduces another hybrid soft computing technique, applied to the process of course selection by students. The author presents a generic framework in which a type-2 fuzzy interface feeds input to an Artificial Neural Network (ANN) system. For novice readers, the chapter covers the basics of fuzzy logic, fuzzy membership functions, type-1 and type-2 fuzzy systems, and ANNs, and it motivates the need for hybridizing ANNs with fuzzy logic. The author then illustrates an experimental prototype consisting of a type-2 fuzzy interface and a base ANN, with the fuzzy interface feeding input to the ANN. The author claims that, with a sufficient amount of good-quality input data, the system performs well and that, with minor modifications, it may be used for HR management, aptitude testing, and general career counseling.
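As a rough sketch of what a type-2 fuzzy interface in front of an ANN might look like, the code below computes an interval type-2 membership for a hypothetical mark, type-reduces it to a crisp feature, and feeds it to a single-neuron stand-in for the base ANN; all membership shapes, weights, and variable names are assumptions for illustration, not the author’s prototype.

```python
# Sketch of a type-2 fuzzy interface feeding an ANN: an interval type-2
# fuzzy set returns a membership *interval* [lower, upper] (the footprint
# of uncertainty) instead of a single degree; a simple type reduction
# (here, the midpoint) turns the interval into a crisp feature for the
# network. All shapes, weights, and names are illustrative assumptions.
import math

def it2_membership_good_score(score):
    """Interval type-2 membership of a mark (0-100) in a term like 'good'."""
    upper = min(1.0, max(0.0, (score - 40) / 30))   # upper membership function
    lower = min(1.0, max(0.0, (score - 50) / 40))   # lower membership function
    return lower, upper

def type_reduce(interval):
    lo, hi = interval
    return (lo + hi) / 2.0                          # simplest possible type reduction

def tiny_ann(features, weights, bias):
    """Single sigmoid neuron standing in for the 'base ANN'."""
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

maths_mark, physics_mark = 72, 58
features = [type_reduce(it2_membership_good_score(maths_mark)),
            type_reduce(it2_membership_good_score(physics_mark))]
print("course suitability score:", round(tiny_ann(features, weights=[2.0, 1.5], bias=-1.0), 3))
```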
In Chapter VI, “A Combined GA-Fuzzy Classification System for Mining Gene Expression Databases,” the authors Gerald Schaefer and Tomoharu Nakashima introduce yet another hybrid soft computing technique, applied to gene expression databases. After explaining fuzzy rule generation and fuzzy rule classification, the authors point out the very large number of fuzzy if-then rules that would result: the number of generated rules increases exponentially with the number of attributes involved and with the number of partitions used for each attribute. A genetic algorithm is employed to reduce the rules to a compact set. The authors discuss the genetic operations employed, present their algorithm in detail, and suggest ways of improving its performance. They demonstrate their hybrid technique on three gene expression datasets commonly used in the literature, viz., the Colon, Leukemia, and Lymphoma datasets. Exhaustive simulation results are given, and the authors state that their technique yields good results.
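The sketch below illustrates the rule-explosion problem and the binary-mask encoding typically used when a GA selects a compact rule subset; the toy numbers, the surrogate accuracy function, and the fitness weighting are assumptions for illustration, not the authors’ actual system.

```python
# Sketch of the rule-explosion problem and a GA-style encoding to tame it.
# With K fuzzy partitions per attribute and N attributes, a full rule base
# contains K**N if-then rules; a GA can evolve a binary mask that keeps only
# a compact subset. The toy numbers, the surrogate accuracy function, and
# the fitness weighting are illustrative assumptions only.
import random

K_PARTITIONS, N_ATTRIBUTES = 3, 10
print("candidate rules:", K_PARTITIONS ** N_ATTRIBUTES)   # 59,049 already

N_RULES = 200                       # pretend pool of candidate fuzzy rules

def classify_accuracy(rule_mask):
    """Stand-in for fuzzy-rule classification accuracy on training data."""
    random.seed(sum(rule_mask))     # deterministic toy surrogate, not a real classifier
    return random.uniform(0.6, 0.95)

def fitness(rule_mask, weight=0.001):
    # Reward accuracy, penalise rule-base size, as in GA-based rule selection.
    return classify_accuracy(rule_mask) - weight * sum(rule_mask)

chromosome = [random.randint(0, 1) for _ in range(N_RULES)]   # 1 = keep that rule
print("rules kept:", sum(chromosome), " fitness:", round(fitness(chromosome), 4))
# A GA would now apply selection, crossover, and bit-flip mutation to a
# population of such masks (compare the evolutionary-loop sketch above).
```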
The chapter “Fuzzy Decision Rule Construction Using Fuzzy Decision Trees (FDT): Application to E-Learning Database” by Malcolm J. Beynon and Paul Jones (Chapter VII) discusses an application of fuzzy logic, specifically fuzzy decision trees. The authors present two extensive FDT analyses. The first uses a small example dataset to illustrate the concepts; the second is in the field of e-learning and considers students’ weekly online activities and their subsequent performance in a university course. The authors emphasize visualization of results throughout the chapter.
Chapter VIII, “A Bayesian Belief Network Methodology for Modeling Social Systems in Virtual Communities: Opportunities for Database Technologies,” is contributed by Ben K. Daniel, Juan-Diego Zapata-Rivera, and Gordon I. McCalla. Bayesian belief networks are used to model situations involving uncertainty, which arise in fields such as the social sciences. A Bayesian model encodes domain knowledge, showing the relationships, interdependencies, and independencies among variables. However, considerable knowledge engineering effort is required to create the conditional probability table for each variable in the network. This chapter describes an approach that combines qualitative and quantitative techniques to elicit knowledge from experts without the need to compute initial probabilities for training the model. The authors demonstrate their technique on a computational model of social capital in virtual communities.
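The small sketch below shows what such domain knowledge looks like once encoded as conditional probability tables in a two-variable belief network, and how a posterior is obtained from it; the variables and probabilities are hypothetical and are not drawn from the authors’ social-capital model.

```python
# Minimal sketch of a Bayesian belief network with explicit conditional
# probability tables (CPTs). The two variables ("Engagement" influencing
# "Trust") and all numbers are hypothetical, chosen only for illustration.

P_engagement = {"high": 0.4, "low": 0.6}                 # prior CPT of the parent
P_trust_given_engagement = {                             # conditional CPT of the child
    "high": {"yes": 0.8, "no": 0.2},
    "low":  {"yes": 0.3, "no": 0.7},
}

def p_joint(engagement, trust):
    return P_engagement[engagement] * P_trust_given_engagement[engagement][trust]

# Marginal probability of trust, summing over the parent variable:
p_trust_yes = sum(p_joint(e, "yes") for e in P_engagement)
print("P(Trust = yes) =", round(p_trust_yes, 3))                       # 0.5

# Posterior of the parent given evidence about the child (Bayes' rule):
p_high_given_trust = p_joint("high", "yes") / p_trust_yes
print("P(Engagement = high | Trust = yes) =", round(p_high_given_trust, 3))  # 0.64
```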
The importance of preserving the accuracy and integrity of data in a database is highlighted in the chapter titled “Integrity Constraints Checking in a Distributed Database” by Hamidah Ibrahim, which discusses checking integrity constraints in distributed databases. The author describes the different integrity tests that can be conducted in distributed databases and gives a clear review of those available in the literature. Finally, the author explains several strategies for checking integrity constraints in distributed databases and discusses important criteria for evaluating the integrity tests.
In Chapter X, “Soft Computing Techniques in Content-Based Multimedia Information Retrieval,” the authors G. Castellano, A. M. Fanelli, and M. A. Torsello clearly explain the basic concepts of four techniques that come under the purview of soft computing, viz., fuzzy logic, neural networks, rough sets, and genetic algorithms. They give a good literature survey of the application of each soft computing technique to Content-Based Multimedia Information Retrieval (CB-MIR), as well as a survey of the application of hybrid neuro-fuzzy techniques to CB-MIR. The authors then discuss in detail the application of hybrid neuro-fuzzy techniques to CB-MIR. They have contributed a system called VIRMA (Visual Image Retrieval by Shape MAtching) that enables users to search for images whose shape is similar to the sketch of a submitted sample image. The neuro-fuzzy strategy extracts a set of fuzzy rules that classify image pixels for the extraction of the contours contained in the processed image, so that these can be stored in the database. The authors show how the neuro-fuzzy technique is useful for CB-MIR.
Chapter XI, “An Exposition of Feature Selection and Variable Precision Rough Set Analysis: Application to Financial Data,” by Malcolm J. Beynon and Benjamin Griffiths presents a Variable Precision Rough Sets (VPRS) analysis of certain Fitch Individual Bank Rating (FIBR) datasets. Two parts of elucidation are undertaken in this chapter: first, the levels of pre-processing that may be necessary for undertaking a Rough Set Theory (RST) analysis, and then the presentation of an analysis using VPRS. The accompanying vein graph software enables one to select a single β-reduct and derive the rules associated with it. Two algorithms are used for feature selection, namely ReliefF and RST_FS. The predictions based on the training and validation sets are displayed in the ‘Predictive Summary Stats’ panel.
In Chapter XII, “Interconnection of Class of Machine Learning Algorithms with Logical Commonsense Reasoning Operations,” Xenia Nadenova introduces the concept of a Good Diagnostic Test (GDT) as the basis of her approach. The chapter explains how a large class of machine learning algorithms can be transformed into commonsense reasoning processes by using well-known deductive and inductive logical rules. Lattice theory is used for constructing good classification tests. Rules for implementing variant transitions are constructed, such as rules of generalization and specialization, inductive diagnostic rules, and dual inductive diagnostic rules, and the commonsense reasoning rules are divided into two classes. An algorithm, DIAGaRa, is proposed for the incremental inference of good diagnostic tests (GMRTs). The algorithm for inferring good tests is decomposed into subtasks and operations that accord with the main human commonsense reasoning rules.
In Chapter XIII, the authors Erhan Akdoğan, M. Arif Adlı, Ertuğrul Taçgın, and Nureddin Bennett propose a human-machine interface (HMI) for controlling a robot manipulator with three degrees of freedom for the rehabilitation of the lower limbs. The system uses a rule-based intelligent controller structure combined with conventional control algorithms. It also has a user-friendly GUI that can be used over the Internet, allowing patients to receive treatment at home. With the HMI, the progress and current state of a patient’s rehabilitation can be stored in a database. The proposed system addresses common problems such as the transportation of patients and the storage and availability of data on the progress of a patient’s rehabilitation. The authors claim that, by using this system, physiotherapists can treat several patients at the same time.
In the last chapter, “Congestion Control using Soft Computing,” the authors T. Revathi and K. Muneeswaran discuss the phenomenon of network congestion. They recapitulate some of the important existing techniques for congestion control, then take up the congestion avoidance problem, explain the need for Active Queue Management (AQM), and review some of the AQM techniques available in the literature. The authors then propose a soft computing technique called Fuzzy-enabled Active Queue Management (F-AQM), which addresses the influence of queuing behavior on the handling of network traffic. They design a fuzzy rule base represented as a matrix indexed by queue length and the rate of change of queue length. They study the performance of their scheme through suitable simulations and compare it with that of the Adaptive Virtual Queue (AVQ) technique. It is claimed that the proposed method outperforms AVQ in reducing the number of dropped packets for different settings of Explicit Congestion Notification (ECN) and queue size.
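The sketch below shows one way such a rule matrix indexed by queue length and its rate of change could be laid out and consulted; the linguistic levels, thresholds, and drop probabilities are illustrative assumptions (and the thresholding is a crisp simplification of genuine fuzzy inference with overlapping membership functions and defuzzification), not the authors’ F-AQM rule base.

```python
# Sketch of a rule base laid out as a matrix indexed by queue length and
# rate of change of queue length, in the spirit of F-AQM. The linguistic
# levels, thresholds, and output drop probabilities are assumptions for
# illustration; real fuzzy inference would use overlapping membership
# functions and defuzzification rather than crisp thresholds.

QUEUE_LEVELS = ["low", "medium", "high"]
RATE_LEVELS = ["decreasing", "steady", "increasing"]

# RULE_MATRIX[queue_level][rate_level] -> packet mark/drop probability
RULE_MATRIX = {
    "low":    {"decreasing": 0.00, "steady": 0.00, "increasing": 0.05},
    "medium": {"decreasing": 0.02, "steady": 0.10, "increasing": 0.30},
    "high":   {"decreasing": 0.20, "steady": 0.50, "increasing": 0.90},
}

def level(value, cuts, labels):
    """Map a crisp value to a linguistic level using simple thresholds."""
    for cut, label in zip(cuts, labels):
        if value <= cut:
            return label
    return labels[-1]

def drop_probability(queue_len, queue_capacity, delta_per_interval):
    q = level(queue_len / queue_capacity, [0.3, 0.7], QUEUE_LEVELS)
    r = level(delta_per_interval, [-1, 1], RATE_LEVELS)
    return RULE_MATRIX[q][r]

print(drop_probability(queue_len=85, queue_capacity=100, delta_per_interval=4))  # 0.9
```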
Author(s)/Editor(s) Biography
K. Anbumani
Kalirajan Anbumani obtained his Bachelor of Engineering degree from the University of Madras (1962), his Master of Engineering degree from the University of Pune (1967), and his Ph.D. degree from the Indian Institute of Science, Bangalore, all in India. He initially served in industry for two years and subsequently took up engineering teaching in government and private engineering colleges, Bharathiar University, and Karunya University. After his last position as Director of the School of Computer Science and Technology, Karunya University, he has recently taken time off to write a book. Prof. Anbumani has many research publications, including chapters in books, in areas such as information security, data mining, data compression, multimedia information retrieval, soft computing, object-oriented methodology, real-time systems, and control. He has completed many funded projects, conducted a number of conferences, and chaired conference sessions. His current interests cover security, including data hiding in multimedia.

R. Nedunchezhian
Raju Nedunchezhian is currently working as the Vice-Principal of Kalaignar Karunanidhi Institute of Technology, Coimbatore, Tamil Nadu, India. Previously, he served as Research Coordinator of the institute and as Head of the Computer Science and Engineering Department (PG) at Sri Ramakrishna Engineering College, Coimbatore. He has more than 17 years of experience in research and teaching. He obtained his BE (Computer Science and Engineering) degree in 1991, his ME (Computer Science and Engineering) degree in 1997, and his Ph.D. (Computer Science and Engineering) in 2007. He has guided numerous UG, PG, and M.Phil. projects and organized several sponsored conferences and workshops funded by private and government agencies. Currently, he is guiding many Ph.D. scholars of Anna University, Coimbatore, and Bharathiar University. His research interests include knowledge discovery and data mining, soft computing, distributed computing, and database security. He has published many research papers in national and international conferences and journals. He is a life member of the Advanced Computing and Communication Society and ISTE.
Reviews and Testimonials
"The advent of soft computing marks a significant paradigm shift in computing. Currently it has a wide range of application. The various techniques of soft computing and their applications in database technologies are discussed in the chapters of this book."
- K. Anbumani


 
