Journal of Computer Science

Recent events have shown that Content Delivery Networks (CDNs) as well as broadband providers are failing to provide continuous service, especially on live video stream transmissions to numerous customers. This study presents a methodology to uninterruptedly measure the uplink and downlink of a given IP connection. Based on an open-source assemblage of development and data-storage platforms, a software tool was developed that automatically performs the proposed assessment. The significance of availability is thoroughly addressed in this article, since it is the first requirement regarding quality of service in any engineered communication. The proposed method relates to the fact that, for a video and/or audio web stream to occur successfully, a connection with each end-user device needs to be sustained the entire time, establishing a complex two-way communication. Meanwhile, traditional cable and satellite broadcasts are less demanding one-way connections, requiring only that the end-user device be placed within range of a radio signal. Given this scenario, and adding the substantial increase in demand for high-quality media content from the internet, an essential need emerges to control the service delivered by CDNs and broadband providers. The developed software also creates a reasonable billing mechanism, which can function as a new technical milestone in contracts and/or Service Level Agreements. This tool also places a key responsibility on the user, since it requires the setup of all desired tests to be inserted manually, which might limit it for some routines.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.437.448 2014/10/02 - 19:51
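The abstract above describes continuous uplink/downlink availability measurement of an IP connection. As a minimal, hedged sketch (not the authors' actual tool), the loop below periodically issues an HTTP request against a user-supplied endpoint and reports reachability and latency; the endpoint URL, probe interval and cycle count are illustrative assumptions.

```python
import time
import urllib.request

def probe(url: str, timeout: float = 5.0) -> tuple[bool, float]:
    """Issue one HTTP GET and report (reachable, elapsed seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True, time.monotonic() - start
    except OSError:
        return False, time.monotonic() - start

def monitor(url: str, interval: float = 10.0, cycles: int = 6) -> float:
    """Run repeated probes and return the measured availability ratio."""
    ok = 0
    for _ in range(cycles):
        reachable, elapsed = probe(url)
        ok += reachable
        print(f"reachable={reachable} latency={elapsed:.3f}s")
        time.sleep(interval)
    return ok / cycles

if __name__ == "__main__":
    # Hypothetical endpoint; replace with the service under test.
    print("availability:", monitor("http://example.com/", interval=1.0, cycles=3))
```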

Fundamental tasks in the problem of scientific support for the design of transport infrastructure within ecosystems are analyzed. New aspects of applying simulation to the dynamic interaction of motor transport and natural systems are considered. A method for simulation studies is analyzed. The problems and advantages of simulating complex systems are investigated. A mathematical model of the influence of transport infrastructure on the environmental situation and an A* search algorithm are developed; the algorithm is guaranteed to find the shortest path provided the heuristic is admissible, i.e., it never exceeds the real remaining distance to the target. The algorithm then uses the heuristic in the best possible way: No other algorithm will expand fewer nodes, disregarding nodes with the same cost. The successful solution of the problem results from the preliminary theoretical research, partly stated in 20 scientific publications and protected intellectual-property objects, including certificates of state registration of computer programs.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.297.303 2014/10/02 - 19:51
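Since the abstract above relies on the A* guarantee (an admissible heuristic never overestimating the remaining distance), here is a minimal, generic A* sketch over a weighted graph; the graph representation and heuristic are illustrative assumptions, not the authors' transport model.

```python
import heapq

def a_star(graph, start, goal, h):
    """graph: dict node -> list of (neighbor, edge_cost); h: admissible heuristic."""
    open_set = [(h(start), 0.0, start, [start])]   # entries are (f, g, node, path)
    best_g = {start: 0.0}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return g, path
        for nbr, cost in graph.get(node, []):
            ng = g + cost
            if ng < best_g.get(nbr, float("inf")):
                best_g[nbr] = ng
                heapq.heappush(open_set, (ng + h(nbr), ng, nbr, path + [nbr]))
    return float("inf"), []

# Toy example: a zero heuristic is always admissible (A* then behaves like Dijkstra).
g = {"A": [("B", 1), ("C", 4)], "B": [("C", 1)], "C": []}
print(a_star(g, "A", "C", lambda n: 0))  # (2.0, ['A', 'B', 'C'])
```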

In research, a variety of statistical methods are used. Hypothesis testing is often used to obtain conclusions from the research conducted, and one form of hypothesis testing that is often used is testing through ANOVA (Box et al., 1978). However, for the conclusions obtained to be valid, there are assumptions that must be met, namely the assumption of homogeneity of variance between treatments (Box et al., 1978; Box and Cox, 1964; Gomez and Gomez, 1984; Guerrero, 1993; Mattjik and Made, 2006). Examination of the variance homogeneity assumption has been carried out in research involving plants that uses a randomized block design. The variance homogeneity assumption was tested using the Bartlett test, but the tests performed have shown somewhat inconsistent results (Box et al., 1978; Box and Cox, 1964; Gomez and Gomez, 1984; Guerrero, 1993; Mattjik and Made, 2006). The Box-Cox transformation is used to transform data that are not homogeneous (Box et al., 1978; Box and Cox, 1964). A computer-based calculation of the variance and of the transformation test has been implemented in the C# programming language.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.18.29 2014/09/24 - 15:47
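The abstract above combines a Bartlett test for homogeneity of variance with a Box-Cox transformation; the original implementation is in C#, but as a hedged illustration the same two steps are sketched below in Python using SciPy, on synthetic treatment groups.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Three illustrative treatment groups with deliberately unequal spread.
groups = [rng.normal(10, s, size=30) for s in (1.0, 2.5, 4.0)]

# Bartlett test: a small p-value suggests the variances are not homogeneous.
stat, p = stats.bartlett(*groups)
print(f"Bartlett statistic={stat:.3f}, p-value={p:.4f}")

# Box-Cox transformation (requires positive data); lambda is fitted by maximum likelihood.
pooled = np.concatenate(groups)
transformed, lam = stats.boxcox(pooled)
print(f"fitted Box-Cox lambda={lam:.3f}")
```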

The number of internet users around the world continues to grow and will keep growing along with advances in communication technology. Communication technology, now headed to the fourth generation (better known as 4G), allows access to rapidly growing information technology. Media for marketing products and services are no longer limited to print and electronic media; nowadays marketing also uses the internet. Internet media are a choice for sellers of goods and services to increase sales. Making a website visited by many internet users is not just a matter of building a good interface. Web sites that serve as a marketing medium must be built with the correct rules, so that the website can be optimized as a marketing medium. A good rule in building a website as a marketing medium is getting the content of the website indexed in search engines such as Google; Google matters because 83% of internet users around the world use it as their search engine. Search Engine Optimization (SEO) is an important set of rules that make a site easier to find on the internet with the keywords users want.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.1.6 2014/09/24 - 15:47

The aim of this research paper is to study and discuss the various classification algorithms applied to different kinds of medical datasets and to compare their performance. The classification algorithms with maximum accuracies on various kinds of medical datasets are taken for performance analysis. The result of the performance analysis shows the most frequently used algorithms on a particular medical dataset and the best classification algorithm to analyse a specific disease. This study gives the details of different classification algorithms and feature selection methodologies. The study also discusses data constraints such as volume and dimensionality problems. This research paper also discusses the new features of the C5.0 classification algorithm over C4.5 and the performance of classification algorithms on high-dimensional datasets. This research paper summarizes various reviews and technical articles which focus on current research on medical diagnosis.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.30.52 2014/09/24 - 15:47

This study presents an experimental evaluation of discrete wavelet transforms for use in speaker identification. The features are tested using speech data provided by the CHAINS corpus. The system consists of two stages: a feature extraction stage and an identification stage. Parameters are extracted and used in a closed-set, text-independent speaker identification task. In this study the signals are pre-processed and features are extracted using discrete wavelet transforms. The energy of the wavelet coefficients is used for training the Gaussian Mixture Model. Daubechies wavelets are used and the speech samples are analyzed using 8 levels of decomposition.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.53.56 2014/09/24 - 15:47
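As a hedged sketch of the pipeline described above (Daubechies wavelet decomposition, sub-band energies as features, a Gaussian Mixture Model per speaker), the code below uses PyWavelets and scikit-learn on synthetic stand-in signals; the frame length, wavelet order and mixture size are illustrative assumptions.

```python
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

def wavelet_energy_features(signal, wavelet="db4", level=8):
    """Energy of each sub-band from an 8-level discrete wavelet decomposition."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

rng = np.random.default_rng(1)
# Synthetic 'speaker' data: many frames per speaker (stand-ins for real speech frames).
frames_spk1 = [rng.normal(0, 1.0, 4096) for _ in range(50)]
frames_spk2 = [rng.normal(0, 2.0, 4096) for _ in range(50)]

X1 = np.vstack([wavelet_energy_features(f) for f in frames_spk1])
X2 = np.vstack([wavelet_energy_features(f) for f in frames_spk2])

gmm1 = GaussianMixture(n_components=4, random_state=0).fit(X1)
gmm2 = GaussianMixture(n_components=4, random_state=0).fit(X2)

# Closed-set identification: pick the speaker model with the highest log-likelihood.
test = wavelet_energy_features(rng.normal(0, 2.0, 4096)).reshape(1, -1)
scores = {"speaker1": gmm1.score(test), "speaker2": gmm2.score(test)}
print(max(scores, key=scores.get), scores)
```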

There are issues of confusion between the concepts of Service Oriented Architecture (SOA) and Software as a Service (SaaS), which affect the benefits they offer, such as cost reduction and agility. To address this problem, the paper aims to explore the concepts of SaaS and SOA in order to give a better understanding of these two technologies. Since SaaS cloud offerings are becoming more popular day by day and many companies are shifting to SaaS solutions, this motivates us to provide a clear understanding of the concepts of the SaaS delivery model and the SOA architecture. Therefore, in this research we have reviewed the concepts and features of both SaaS cloud and SOA and then compared them with traditional on-premise software.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.64.70 2014/09/24 - 15:47

This paper proposes a fuzzy logic association algorithm to predict the risks involved in identifying diseases such as breast cancer. The fuzzy logic algorithm is used to find association rules. The results of the study reveal that the prediction is more reliable than with conventional methods.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.71.74 2014/09/24 - 15:47

This study proposes a new method of estimating the fingerprint orientation field by utilizing prior knowledge of fingerprint images and a Self-Organizing Map (SOM). The method is based on the assumption that fingerprint images have some common properties that can be systematized to build prior knowledge. In this method, each fingerprint image was divided equally into 16 regions and the regions were analyzed separately. To analyze the regions, they were divided into blocks of 8x8 pixels. The feature vector of each block was constructed by summing the pixel intensity values row-wise and column-wise. Furthermore, the feature vectors of the blocks were concatenated to form a feature matrix of the region. The matrix was then processed by the SOM to find the most dominant orientation field of the region. The experimental results showed that the chosen feature gave short SOM training epochs. In addition, the method was able to estimate the orientation field of most regions. However, the method could not precisely determine the orientation field if a block was dominated by background pixels.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.75.81 2014/09/24 - 15:47
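The block feature described above (an 8x8 block summarized by its row-wise and column-wise intensity sums) is easy to illustrate; the sketch below builds that 16-element vector per block for one region, leaving the SOM stage out. The region and block sizes follow the abstract; everything else is an assumption.

```python
import numpy as np

def block_feature(block: np.ndarray) -> np.ndarray:
    """8x8 block -> 16-element vector: 8 row sums followed by 8 column sums."""
    return np.concatenate([block.sum(axis=1), block.sum(axis=0)])

def region_feature_matrix(region: np.ndarray, block_size: int = 8) -> np.ndarray:
    """Stack the feature vectors of all non-overlapping blocks in a region."""
    h, w = region.shape
    feats = [
        block_feature(region[r:r + block_size, c:c + block_size])
        for r in range(0, h - block_size + 1, block_size)
        for c in range(0, w - block_size + 1, block_size)
    ]
    return np.vstack(feats)

# Toy 64x64 'region' (one of the 16 regions of a fingerprint image).
region = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(region_feature_matrix(region).shape)  # (64, 16): 64 blocks, 16 features each
```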

Six applications based on the mediator approach have been reviewed in this study. In practice, the mediator is used to integrate and access data from different data sources. The important characteristics for the implementation of the mediator approach have been identified. These include types of data, file formats and object data. These characteristics, together with the advantages and disadvantages of each mediator implementation, are described in section 4. This is useful for other researchers carrying out literature reviews. Indeed, this study highlights important issues that need to be addressed before future directions for research in the area of mediator-based database integration are charted.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.204.217 2014/09/24 - 15:47

Person identification based on a unimodal biometric system suffers from noise, intra-class similarity, non-universality, lack of distinctiveness and spoof attacks. To alleviate the problems faced in unimodal biometric systems, biometric traits are combined in a multimodal biometric system. In this study a new approach to improve the recognition rate and to reduce computational complexity and storage space is presented. A distinct method of person identification using detection and extraction of the optic disc from the retina and the concha from the ear is carried out. The Region Of Interest (ROI) locator proposed here automatically detects the optic disc from either the right or the left eye and extracts it. Feature-level fusion of the optic disc and concha is performed for recognition of a person. The method is tested with and without the ROI locator on publicly available databases, and the experimental results show that our multimodal biometric system performs better with the ROI locator than without it. A Matching Rate (MR) of 95 to 100% and an Equal Error Rate (EER) of less than 10% are achieved with this system. The new approach was also tested for a unimodal system with the ROI locator and was able to achieve a 100% Matching Rate.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.153.165 2014/09/24 - 15:47

Information processing and information routing in wireless sensor networks is one of the most important key areas of recent research. Building an optimal routing protocol for information routing is a tough task when dealing with sensor nodes. Information exchange should be performed in the best possible way in order to provide secure communication and reliable data delivery; in this study we focus on a routing protocol with reliable delivery. Hence we propose a new algorithm/protocol which optimally routes data packets from source to destination with efficient utilization of energy resources. The proposed protocol is based on next-hop graph nodes for reliable data packet routing, framing the route in the form of graphs. The simulation results presented in the performance analysis phase show that the proposed method outperforms the existing routing protocol in terms of reliability and robustness in efficient routing. The proposed methodology achieves connectivity and a connectivity path in the WSN, which is critical.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.218.223 2014/09/24 - 15:47

Grid computing plays an important role in solving large-scale computational problems in a high-performance computing environment. Scheduling tasks onto the most efficient and suitable resources is one of the most challenging phases in grid computing systems. The grid environment presents several challenges to efficient scheduling of complex applications because of its heterogeneity, dynamic behavior and shared resources. Scheduling of independent tasks in grid computing is dealt with by a number of heuristic algorithms. This study proposes a new heuristic algorithm for mapping independent tasks in a grid environment so that they are assigned optimally among the available machines in a grid computing system. Due to the multi-objective nature of the grid scheduling problem, several performance measures and optimization criteria can be assumed to determine the quality of a given schedule. The metrics used here include makespan and resource utilization. The algorithm provides effective resource utilization by reducing machine idle time and minimizes makespan. The algorithm also balances load among the grid resources and produces high resource utilization with low computational complexity. The proposed algorithm is compared with other popular heuristics on these performance measures.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.224.229 2014/09/24 - 15:47

Transshipments and stopovers are considered an effective method to reduce traveling distance, where a transportation job can be served by two vehicles: One picks up a load and drops it at a transshipment point, and then another vehicle carries that load to the final delivery place. The goal of this study is to develop a decision support system for open vehicle routing with transshipments and stopovers. We propose a heuristic to find transshipment and stopover opportunities from an initial routing. The decision method consists of four main processes: (1) Searching for jobs that allow a transshipment opportunity, (2) searching for paths that allow a transshipment opportunity, (3) matching paths and (4) selecting jobs to create new paths with transshipment. The output is an improved routing with transshipments and stopovers, resulting in lower total costs. In computational experiments, our proposed method reduced the system's total cost by up to 12.42 percent compared to typical routing without transshipments and stopovers. We designed the system database and user interfaces, considering all input-entry requirements and result displays so that they are easy to use and the system can be effectively applied in actual working environments.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.241.253 2014/09/24 - 15:47

Cloud storage enables users to remotely store their data and benefit from on-demand, high-quality cloud applications without the difficulty of local hardware and software management. Though the benefits are clear, such a service also relinquishes users' physical possession of their outsourced data, which inevitably poses new security risks towards the recovery of the data in the cloud. In order to address this new problem and further achieve a secure and useful cloud storage service, we propose in this study a flexible distributed storage integrity mechanism utilizing the homomorphic token and distributed data. The proposed design allows users to check the cloud storage with very lightweight communication and computation cost. The auditing result not only ensures strong cloud storage correctness guarantees and efficiency, but also simultaneously achieves data error localization, i.e., the identification of misbehaving servers. Considering that cloud data are dynamic in nature, the proposed design further supports secure and efficient dynamic operations on outsourced data, including block modification, update, deletion and append. The proposed scheme is highly efficient and secure against Byzantine failure, malicious data modification attacks and even server colluding attacks.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.254.261 2014/09/24 - 15:47

We propose a weighted service queuing approach, in which each service queue is allotted a preset priority weight, together with a Call Admission Control (CAC) scheme for the service classes defined in the IEEE 802.16 standard. Different groups of queues have been implemented to transmit different service classes. Allocation of bandwidth to each service class relies on the maximum data that can take part in a connection between different networks. Quality of Service is achieved by considering throughput and delay as the major parameters, taking the corresponding service classes into account. The scheduled CAC model with the token bucket mechanism increases the Quality of Service performance for multimedia services. By applying the weighted queuing approach, the acceptance of new connection requests is increased, achieving better Quality of Service for multimedia services over next-generation wireless networks.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.427.436 2014/09/21 - 18:17
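The CAC model above relies on a token bucket mechanism; as a hedged, generic illustration (not the authors' scheduler), the class below admits a packet only when enough tokens have accumulated, with the rate and bucket depth as assumed parameters.

```python
import time

class TokenBucket:
    """Classic token bucket: tokens accrue at `rate` per second, up to `capacity`."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_size: float) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_size <= self.tokens:
            self.tokens -= packet_size
            return True
        return False  # the packet (or connection request) is not admitted yet

bucket = TokenBucket(rate=1000.0, capacity=1500.0)   # 1000 bytes/s, 1500-byte burst
for size in (500, 500, 800):
    print(size, "admitted" if bucket.allow(size) else "rejected")
```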

In a learning environment, a low student-lecturer ratio is considered a practical solution by many educational institutions. However, the number of students in information technology is increasing every year. This could lead to a significant increase in the workload of lecturers, who need to evaluate assignments, quizzes and projects. Hence, it is desirable that an automated assessment tool be used to lessen their workload. In an era where mobile devices are gaining popularity, there is high demand to execute code of suitable quality, and the cost falls on the processing power of the CPU, which has a direct implication on the power source or battery used. With various implementations of cryptographic algorithms available, many of them can satisfy different levels of needs. In this research, we introduce an architecture for multi-layered security of automated assessment of programming code. First, we review the existing research studies in the area. We describe the features of the tool as part of a complex e-learning environment. We also discuss the implementation of security to protect the data transmission and storage used by the tool. Challenges the system might face, and potential solutions, are also described.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.417.426 2014/09/21 - 18:17

With the rapid growth of communication, security plays a central role in maintaining the confidentiality of data in group communication. Keeping the data intended for the group confidential is the most important security feature that needs to be sustained for group communication. An efficient group key management mechanism named the Hybrid Broadcast Group Management Protocol (HBGMP) is devised based on a Reverse Function (RF) and the Chinese Remainder Theorem (CRT). Distinct security among the subgroups is ensured by the reverse function, and the session ID of each subgroup is calculated by employing the Chinese Remainder Theorem. By computing the session ID with the Chinese Remainder Theorem, a cohort of n users requires the Sub Group Service Provider (SGSP) to do O(n/m) computation, and the communication and storage costs are diminished by diverting the computing load of the Group Service Provider (GSP) onto the SGSP. The significance of this protocol is that a group member needs to store only two different values during the entire life span, and the rekey message is broadcast, which brings the communication cost down to O(1). The protocol is defined generally for any application with a hybrid architecture. The proposed architecture using the CRT and the reverse function is scalable for large, dynamically changing groups.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.344.350 2014/09/21 - 18:17
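Because the scheme above computes a subgroup session ID with the Chinese Remainder Theorem, a small, generic CRT solver is sketched below; the moduli and residues are toy values, not the protocol's actual key material.

```python
from math import prod

def crt(residues, moduli):
    """Solve x = r_i (mod m_i) for pairwise-coprime moduli; returns x mod prod(m_i)."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi modulo m (Python 3.8+).
        x += r * Mi * pow(Mi, -1, m)
    return x % M

# Toy example: a 'session ID' congruent to 2 mod 3, 3 mod 5 and 2 mod 7.
session_id = crt([2, 3, 2], [3, 5, 7])
print(session_id)                                       # 23
print(session_id % 3, session_id % 5, session_id % 7)   # 2 3 2
```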

In automated pulmonary nodule extraction and lung disease diagnosis by image processing techniques, image segmentation is used as the primary and most essential step of lung tumor analysis. However, due to the extensive similarity between pulmonary vessels, bronchi and arteries in the lung region, and the low contrast of the Computed Tomography (CT) image, the accuracy of lung tumor diagnosis is highly dependent on the precision of segmentation. Therefore, precise lung CT image segmentation has become a challenging preprocessing task for every lung disease pathology application. In this study, a novel Four-Directional Thresholding (FDT) technique is introduced. The proposed technique segments the pulmonary parenchyma in CT images using Similarity-Based Segmentation (SBS). The technique aims to increase the precision of CT image thresholding by applying an advanced thresholding approach from four different directions, in which the classification of a pixel as foreground or background depends strongly on its adjacent pixel's intensity value, and the final decision is made based on all four directions' thresholding results. In this study the importance of neighboring pixels for the precision of thresholding with the FDT technique is demonstrated and the effectiveness of the FDT method is evaluated on different CT images. Finally, the segmentation results of the FDT method are compared with earlier techniques, which corroborates the high accuracy of the proposed technique.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.195.203 2014/09/21 - 18:17
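The abstract above leaves the exact rule unspecified, so the sketch below is only one possible reading, not the authors' FDT algorithm: the image is examined from each of four directions, each pass thresholds a pixel against a blend of a global value and the intensity of the neighbor seen in that scan direction, and a vote over the four passes gives the final mask. All numeric parameters (blend weight, vote count, threshold) are assumptions.

```python
import numpy as np

def shift(img, dr, dc):
    """Shift the image by (dr, dc), padding with edge values."""
    padded = np.pad(img, 1, mode="edge")
    return padded[1 + dr:1 + dr + img.shape[0], 1 + dc:1 + dc + img.shape[1]]

def four_directional_threshold(img, t, alpha=0.5):
    """Vote over four passes; in each, a pixel is compared with a threshold blended
    from the global value t and its neighbor in that scan direction."""
    img = img.astype(float)
    votes = np.zeros(img.shape, dtype=int)
    # Neighbors seen when scanning left->right, right->left, top->bottom, bottom->top.
    for dr, dc in [(0, -1), (0, 1), (-1, 0), (1, 0)]:
        neighbor = shift(img, dr, dc)
        local_t = alpha * t + (1 - alpha) * neighbor
        votes += (img > local_t).astype(int)
    return votes >= 3   # foreground if at least 3 of the 4 directions agree

toy = np.random.default_rng(0).integers(0, 256, size=(6, 6))
print(four_directional_threshold(toy, t=128).astype(int))
```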

Websites on the internet are a useful source of information in our day-to-day activities. Web Usage Mining (WUM) is one of the major applications of data mining, artificial intelligence and related fields to web data, predicting users' visiting behaviours and obtaining their interests by analyzing patterns. WUM has turned out to be one of the considerable areas of research in the field of computer and information science. The web log is one of the major sources containing all the information regarding users' visited links, browsing patterns and time spent on a page or link, and this information can be used in several applications such as adaptive web sites, personalized services, customer profiling, pre-fetching and creating attractive web sites. WUM consists of preprocessing, pattern discovery and pattern analysis. Log data are typically noisy and unclear, so preprocessing is essential for effective mining. In the preprocessing phase, the data cleaning process includes removal of records for graphics, videos and format information, records with failed HTTP status codes, and robot cleaning. In the second phase, the user behaviour is organized into a set of clusters using Weighted Fuzzy-Possibilistic C-Means (WFPCM), which consist of "similar" data items based on user behaviour and navigation patterns, for use in pattern discovery. In the third phase, classification of the user behaviour is carried out in order to analyze it using an Adaptive Neuro-Fuzzy Inference System with Subtractive Algorithm (ANFIS-SA). The performance of the proposed work is evaluated based on accuracy, execution time and convergence behaviour using the Anonymous Microsoft Web dataset.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.372.382 2014/09/21 - 18:17

The number of on-road vehicles has increased and monitoring them is a challenging task. In crowded public areas, people and vehicles are common objects for monitoring. The proposed system aims at detecting number plate information that indicates possible security-relevant issues. Existing systems perform recognition mainly by using the license plate alone; adding features such as logo, colour and shape increases the security of the system. Identification of the number plate region is done by a blob detection method at a predefined aspect ratio. After detection, the number plate information is extracted using an eigenvalue regularization method. Further, two methodologies included in this study are identifying the tampered region in a car image either by extracting HoG features in the spatial domain or by block differences in DCT coefficients and their corresponding histogram in the transform domain. Experimental results for the given car dataset quantitatively describe the identification of the number plate region and the tampered region. The work presents detailed results of how the proposed approach gives better results using the HoG approach. The approach gives good results on videos of cars recorded in frontal view in good lighting conditions. Overall, the paper suggests a hybrid approach for detecting number plate information in cars photographed in good lighting conditions.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.304.314 2014/09/21 - 18:17
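One of the cues above is a HoG descriptor computed in the spatial domain; as a hedged sketch (the authors' exact parameters are not given), the snippet below extracts HoG features from an image region with scikit-image and compares two regions by Euclidean distance.

```python
import numpy as np
from skimage.feature import hog

def hog_descriptor(patch: np.ndarray) -> np.ndarray:
    """Histogram of Oriented Gradients for a grayscale patch (assumed parameters)."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

rng = np.random.default_rng(0)
original = rng.random((64, 128))          # stand-in for a plate-region crop
tampered = original.copy()
tampered[20:40, 40:80] = 0.0              # simulated tampering

d = np.linalg.norm(hog_descriptor(original) - hog_descriptor(tampered))
print(f"HoG distance between regions: {d:.3f}")
```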

This study presents a comparison of recognition performance between feature extraction on the T-Zone face area and radius-based blocks on critical points. A T-Zone face image is first divided into small regions from which Local Binary Pattern (LBP) histograms are extracted and then concatenated into a single feature vector. The dimensionality of this feature vector is further reduced by using the well-established Principal Component Analysis (PCA) technique. On the other hand, while the original LBP technique focuses on dividing the whole image into certain regions, we propose a new scheme which focuses on critical regions, which have more impact on recognition performance. This technique is known as Radius-Based Block Local Binary Pattern (RBB-LBP). Here we focus on three main areas: the eyes (including eyebrows), the mouth and the nose. We define four critical points representing the left eye, right eye, nose and mouth, and from these four main points we derive nine further points. This approach automatically creates redundancy in various regions, and for every radius-sized window a robust histogram with all possible labels is constructed. Experiments have been carried out on different sets of the Olivetti Research Laboratory (ORL) database. RBB-LBP obtained high recognition rates when compared to standard LBP, LBP+PCA and the T-Zone area alone. Our results show a 16% improvement compared with LBP+PCA and a 6% improvement compared with LBP. Our studies prove that the RBB-LBP method reduces the length of the feature vector while improving recognition performance.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.96.108 2014/09/17 - 01:06
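As a hedged sketch of the baseline pipeline mentioned above (regional LBP histograms concatenated, then reduced with PCA), the code below uses scikit-image and scikit-learn on random stand-in face images; the region grid, LBP radius and PCA size are assumptions, and the RBB-LBP scheme itself is not reproduced here.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.decomposition import PCA

def lbp_region_histograms(img, grid=(4, 4), P=8, R=1):
    """Concatenate uniform-LBP histograms from a grid of regions."""
    lbp = local_binary_pattern(img, P, R, method="uniform")
    n_bins = P + 2                      # number of uniform-LBP labels
    gh, gw = img.shape[0] // grid[0], img.shape[1] // grid[1]
    feats = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            patch = lbp[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            hist, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins))
            feats.append(hist / hist.sum())
    return np.concatenate(feats)

rng = np.random.default_rng(0)
faces = rng.random((40, 64, 64))                     # stand-ins for ORL-style images
X = np.vstack([lbp_region_histograms(f) for f in faces])
X_reduced = PCA(n_components=20).fit_transform(X)    # dimensionality reduction step
print(X.shape, "->", X_reduced.shape)                # (40, 160) -> (40, 20)
```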

Email-based communication, over the course of globalization in recent years, has transformed into an all-encompassing form of interaction and requires automatic processes to control email correspondence in an environment of ever-growing email databases. Relevance characteristics defining the class of an email generally include the topic of the email and the sender of the email, along with the body of the email. Intelligent reply algorithms can be employed in which machine learning methods accommodate email content, using probabilistic methods to classify the context and nature of the email. This helps in the correct selection of a template for the email reply. Still, redundant information can cause errors in classifying an email. Natural Language Processing (NLP) possesses potential for optimizing text classification due to its direct relation with language structure. An enhancement is presented in this research to address email management issues by incorporating optimized information extraction for email classification, along with generating relevant dictionaries as emails vary in category and increase in volume. The open hypothesis of this research is that the underlying purpose of an email is to communicate a message in the form of text. It is observed that NLP techniques improve the performance of the Intelligent Email Reply algorithm, enhancing its ability to classify and generate email responses with minimal errors using probabilistic methods. The improved algorithm is functionally automated with machine learning techniques to assist email users who find it difficult to manage a large variety of emails.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.109.119 2014/09/17 - 01:06

The requirements of online users on a website vary dynamically. The recommendation of web pages containing the information and data that users expect is performed by an online recommendation system. The recommendation engine must be self-adaptive and accurate. The existing algorithm uses Depth First Search (DFS) and a bee-foraging approach to create navigation profiles by categorizing the current user activity. The prediction of the navigations that online users are most likely to visit is also performed. In this study, the formation of a recommendation engine with optimized resources such as memory, CPU usage and minimum time consumption is proposed using DFS and a Genetic Approach (GA). Here, the cluster formation is initially achieved using the DFS approach. The method creates an eminent browsing pattern for each user using a live session window. The performance of the approach is compared with the existing forager agent. The experimental results show that the proposed approach outperforms the existing methods in accomplishing accurate classification and anticipation of future navigation for the current online user.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.188.194 2014/09/17 - 01:06

In this research, a numerical integration method is proposed to improve the computational accuracy of Legendre moments. To clarify the improved computation scheme, image reconstructions from higher orders of Legendre moments, up to order 240, are conducted. With the more accurately generated moments, the distribution of image information in a finite set of Legendre moments is investigated. We conclude that each individual finite set of Legendre moments represents unique image features independently, while the even orders of Legendre moments describe most of the image characteristics.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.127.136 2014/09/17 - 01:06
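As a hedged reference point for the discussion above, the snippet below computes low-order 2-D Legendre moments of an image mapped onto the standard [-1, 1] x [-1, 1] domain with a plain rectangular quadrature rule, i.e., the kind of baseline that an improved numerical integration scheme would refine; the orders and the image are illustrative.

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_moment(img, p, q):
    """Order-(p, q) Legendre moment of a grayscale image mapped onto [-1, 1]^2,
    approximated with a simple rectangular (zeroth-order) quadrature rule."""
    n_rows, n_cols = img.shape
    x = -1 + (2 * np.arange(n_cols) + 1) / n_cols   # pixel-center x coordinates
    y = -1 + (2 * np.arange(n_rows) + 1) / n_rows   # pixel-center y coordinates
    Pp = L.legval(x, [0] * p + [1])                 # P_p evaluated on columns
    Pq = L.legval(y, [0] * q + [1])                 # P_q evaluated on rows
    norm = (2 * p + 1) * (2 * q + 1) / 4.0
    dx_dy = (2.0 / n_cols) * (2.0 / n_rows)
    return norm * dx_dy * np.sum(np.outer(Pq, Pp) * img)

img = np.random.default_rng(0).random((32, 32))
for p, q in [(0, 0), (1, 0), (2, 2)]:
    print(p, q, legendre_moment(img, p, q))
```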

The motivation of the proposed compression method is to reduce the bit rate for image transmission, or the memory requirement for image storage, while maintaining image quality. Edges are one of the most prominent features of an image and they are essential for maintaining image quality. JPEG compression standards like JPEG98 and JPEG2000 produce visual artifacts in the reconstructed image at low bit rates because they are not tailored to detailed information such as edges. Hence, second-generation coding was introduced to preserve edge information, in which approximation and detail information are encoded separately, which introduces additional computational time and complexity. A multi-directional anisotropic shearlet transform provides an optimally efficient representation of images with edges, whereas the wavelet transform has limited capability in dealing with edge information in all directions. Here, a multidirectional transform called the extended shearlet transform is used to decorrelate the input gray-level values with edge-preserving capability. A hard thresholding method is applied to the transform coefficients and finally the thresholded output is encoded using the Set Partitioning In Hierarchical Trees (SPIHT) technique. A comparative analysis is performed between Edge Preserved Wavelet Transform coding (EPWT) and the extended shearlet transform coding. Image quality is measured objectively using peak signal-to-noise ratio and the Structural Similarity Index (SSIM), and subjectively using perceived image quality. The simulation results show that the extended shearlet-based compression technique is more efficient than the EPWT coding technique for a wide range of geometrical features of images. Quantitative analysis on standard test images shows that the proposed technique outperforms the EPWT coding technique by 0.16 dB to 1.46 dB of PSNR with less computational time.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.82.88 2014/09/17 - 01:06

Multimodal retinal imaging is an important facet of the diagnosis and therapy of retinal disorders such as retinopathy and occlusion. Many imaging techniques for multimodal retinal images have been developed in recent years. These developments are subject to a trade-off between computational time and effective registration. This study aims at developing a new algorithm based on RANSAC matching and a gradient iterative closest point technique, which has proven to have less computational time with the best matched coordinates irrespective of the nature of the input retinal image. This study uses a new adaptive thresholding technique to extract bifurcations from the target image, and the control points are selected using the RANSAC matching algorithm. The registration is achieved by implementing a gradient iterative closest point algorithm to minimize the mean square error between the target control points of the base and reference images.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.89.95 2014/09/17 - 01:06

The Mitogen-Activated Protein Kinase (MAPK) cascade has been used as a case study for different computational software tools in both modelling and simulation of real-life problems. This study focuses on the development of new techniques for solving a MAPK cascade by using membrane computing. Membrane computing is an unconventional computational approach that provides a platform for modelling discrete systems. This approach deals with parallel, distributed and non-deterministic computing models. P-Lingua is used to specify and analyze a wide range of quantitative properties and to offer a general syntactic framework that could become a formal standard for membrane computing. The model is simulated using MeCoSim, a membrane computing simulator used to verify and validate the model. This study aimed to compare the use of the membrane computing approach for modelling the MAPK cascade with other models. The MAPK cascade, specified in P-Lingua, is evaluated based on the simulation with MeCoSim. In general, we can say that the membrane computing model is better than previous models because it takes into account the changes that occur in the cell, and because the problem is mainly a problem in the cells, the model achieves the best results and reliability.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.178.187 2014/09/17 - 01:06

With the growing industrial impact of computer science over recent years, data mining has established itself as one of the most important disciplines. In the fast-growing Web, locating resources that are precise and relevant in an appropriate amount of time is a huge challenge for all-purpose single-process crawlers, which puts enhanced and convincing algorithms in demand. Large-scale search engines update their indexes only gradually and are therefore not capable of presenting such information in a timely manner. In this study a scalable focused crawler is proposed as an incremental parallel Web crawler, so that Web pages relevant to multiple pre-defined topics can be crawled concurrently. Furthermore, to solve the issue of URL distribution, a compound decision model based on a multi-objective decision-making method is introduced, which considers multiple factors such as load balance and relevance synthetically; the update-frequency issue is solved by the local repository decision. The results show that our proposed system efficiently produces high quality, relevance and freshness with a significantly low memory requirement.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.120.126 2014/09/14 - 14:47

Nowadays, images acquired by digital cameras and defective sensors tend to be corrupted by noise during either the image acquisition or the transmission process, and the quality of the image is degraded to a significant degree. A lot of research work has been carried out over several decades to remove impulse noise, and each approach has its own merits and demerits. This study presents a new denoising approach for gray-scale images to remove fixed-value salt-and-pepper noise present in the images. The algorithm was implemented for gray-scale images such as Lena and Cameraman, and the performance results are compelling both qualitatively and quantitatively. This study considers performance metrics such as PSNR and MSE for quantitative measurement and presents better results from low-density up to high-density noise levels (up to 100%) when compared to other existing filters. The visual interpretation shows that this method also proves better in qualitative analysis by human perception. In addition, the proposed approach decreases the computational and hardware complexity by an appreciable amount, since the many comparisons done by traditional sorting schemes are largely avoided; thus very fast operation can be achieved. The approach relies on neighborhood pixel comparisons confined to the previous pixel and the pixel next to the processing pixel under consideration; the absence of sorting saves much time and many operations, which in turn increases the speed of operation and achieves better reconstruction of images.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.57.63 2014/09/12 - 11:26
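The abstract above only sketches the rule (compare each pixel with its immediate previous and next neighbors, with no sorting), so the code below is a hedged interpretation rather than the authors' filter: a pixel at an extreme value (0 or 255) is treated as salt-and-pepper noise and replaced using only its left and right neighbors. The extreme-value test and the fallback rule are assumptions.

```python
import numpy as np

def denoise_row_neighbors(img: np.ndarray) -> np.ndarray:
    """Replace extreme-valued pixels (0 or 255) using only the previous and next
    pixels in the same row; no sorting window is used."""
    out = img.astype(float).copy()
    h, w = img.shape
    for r in range(h):
        for c in range(1, w - 1):
            if img[r, c] in (0, 255):                 # suspected impulse
                left, right = img[r, c - 1], img[r, c + 1]
                good = [v for v in (left, right) if v not in (0, 255)]
                # Fall back to the average of both neighbors if both look corrupted.
                out[r, c] = np.mean(good) if good else (int(left) + int(right)) / 2
    return out.astype(np.uint8)

rng = np.random.default_rng(0)
clean = np.full((4, 8), 120, dtype=np.uint8)
noisy = clean.copy()
noisy[rng.random(noisy.shape) < 0.2] = 0             # inject 'pepper' noise
print(denoise_row_neighbors(noisy))
```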

An efficient image generation algorithm is proposed. It generates a Message Authentication Image (MAI) by using fractals and chaos theory. The fractal images are generated using Iterated Function System (IFS) techniques. We implemented and generated the fractal images exploring the properties of chaos. Chaos is unpredictable behavior arising in a dynamical system, such that the future behavior cannot be predicted. The chaos is seeded by an initial condition generated by a Pseudo-Random Number Generator (PRNG). The chaotic behavior of the system is also analyzed. We use these fractal images as a digital signature. This technique can be employed in online transactions such as banking and shopping to avoid phishing, and we can also watermark the fractal image and use it for government and private identification proofs.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.315.324 2014/09/12 - 11:26
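The fractal generation step above relies on an Iterated Function System driven by a PRNG-chosen sequence of maps; a minimal, generic illustration (the classic Barnsley fern, not the authors' MAI construction) is sketched below, with the PRNG seed playing the role of the initial condition.

```python
import random

# Barnsley fern IFS: four affine maps (a, b, c, d, e, f) with selection probabilities.
MAPS = [
    (0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),
    (0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),
    (0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),
    (-0.15, 0.28,  0.26, 0.24, 0.0, 0.44, 0.07),
]

def ifs_points(n: int, seed: int = 42):
    """Iterate the IFS from (0, 0); the PRNG seed acts as the 'initial condition'."""
    rng = random.Random(seed)
    x = y = 0.0
    pts = []
    for _ in range(n):
        a, b, c, d, e, f, _p = rng.choices(MAPS, weights=[m[6] for m in MAPS])[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        pts.append((x, y))
    return pts

points = ifs_points(10000)
print(len(points), points[-1])   # plotting the points reveals the fern shape
```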

With the growth in computing power, speech recognition carries strong potential in the near future. It has become increasingly popular with the development of mobile devices. Presumably, mobile devices have limited computational power, memory size and battery life. In general, speech recognition requires heavy computation due to the large number of samples per window used. The Fast Fourier Transform (FFT) is the most popular transform for finding formant frequencies in speech recognition. In addition, the FFT operates over the complex field with imaginary numbers. This paper proposes an approach based on the Discrete Tchebichef Transform (DTT) as a possible alternative to the FFT for finding the formant frequencies. The experimental outputs in terms of formant frequencies using the FFT and the DTT have been compared. Interestingly, the experimental results show that both produce nearly identical formant shape outputs for basic vowel and consonant recognition. The DTT has the same capability to recognize the speech formants F1, F2 and F3 on real domains.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.351.360 2014/09/12 - 11:26
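As a hedged baseline for the FFT side of the comparison above, the snippet below estimates spectral peaks (rough formant candidates) from a synthetic vowel-like signal using NumPy; the signal, frame length and peak-picking rule are assumptions, and a real system would normally add LPC or cepstral smoothing.

```python
import numpy as np

fs = 8000                                   # sampling rate (Hz), assumed
t = np.arange(0, 0.032, 1 / fs)             # one 32 ms analysis window
# Synthetic 'vowel': energy concentrated near 700, 1200 and 2600 Hz.
signal = sum(np.sin(2 * np.pi * f * t) for f in (700, 1200, 2600))
signal *= np.hamming(len(signal))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# Crude peak picking: local maxima above 20% of the global maximum.
peaks = [freqs[i] for i in range(1, len(spectrum) - 1)
         if spectrum[i] > spectrum[i - 1]
         and spectrum[i] > spectrum[i + 1]
         and spectrum[i] > 0.2 * spectrum.max()]
print("formant candidates (Hz):", [round(p) for p in peaks[:3]])
```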

This study presents an image enhancement approach based on the Cuckoo Search (CS) algorithm with morphological operations. At the present time, digital images are used in many image processing applications; machine vision, computer interfaces, manufacturing and compression for storage are some of the fields of application. Before being used in any application the image has to be prepared, and such processing is called image enhancement. We propose a method for enhancing digital images that combines the cuckoo search algorithm with morphological operations. The appearance of noise produces distortion in an image, making the image unattractive and decreasing the discernibility of many features inside it. In this study, we work to overcome this drawback by obtaining an improved contrast value after converting the color image into a grayscale image. The fundamental characteristic of the CS algorithm is that the amplitudes of its components can objectively reflect the contribution of the gray levels to the representation of image information for the best contrast value of an image. After selecting the best contrast value of the image with the CS algorithm, morphological operations are applied, in which the intensity parameters of the image are adjusted to improve its quality. Experimental results demonstrate that the proposed approach converts back to the original color image without noise and adaptively enhances the quality of images.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.7.17 2014/09/12 - 11:26

Computational stylometry is the field that studies the distinctive style of a written text using computational methods. The first task is to define quantifiable measures in a text and the second is to classify the text into a predefined category. This study proposes a stylometric feature selection approach, evaluated by machine learning algorithms, to find the best features and to study the impact of feature selection on classifier performance in the domain of oath statements in the Quranic text. The results show that better classifier performance is strongly affected by the best feature selection, which is associated with an explicit oath style.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.325.329 2014/09/09 - 14:17

In the modern wireless communication world, the security of data transfer has been the most challenging task. In embedded systems, AES is the most extensively used cryptographic algorithm in practice, but its functionality has been disrupted by the DPA attack. There have been several countermeasures against these attacks, but this study proposes a new measure to defend against the DPA attack. A DPA attack is possible due to the power fluctuation arising from sequential-circuit clocking during the substitute-byte (SubBytes) step of the first and last rounds of AES encryption. Hence, to prevent this, the power variation is maintained at a constant pace throughout the data processing. This is achieved by incorporating a combinational logic design instead of a sequential logic circuit in AES. The proposed design is implemented on a Vertex III FPGA device and, even after 17230 power traces, the secret key is not disclosed, as the power fluctuations are completely random. The power consumption, when examined with the Microwind software, proves to be constant, almost the same power is obtained when implementing it in hardware, and the instant of data processing cannot be identified.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.291.296 2014/09/02 - 20:13

This paper presents a comparative analysis of the renowned cryptographic algorithms AES, DES and RSA. The Rijndael algorithm was adopted as the Advanced Encryption Standard (AES), succeeding the Data Encryption Standard (DES), which had long been part of the security standards. The comparative analysis is implemented on the IEEE 802.11i wireless platform. Compared to DES-based schemes, AES-based 802.11i includes CCMP, a security standard that provides the highest level of security by encrypting and authenticating the data simultaneously. The CCMP protocol is designed to provide robust security and is based on the Advanced Encryption Standard (AES) encryption algorithm. It uses the Counter Mode with CBC-MAC (CCM) mode of operation, which combines Counter (CTR) mode for privacy and Cipher Block Chaining Message Authentication Code (CBC-MAC) for authentication. The implementation is done on the NS-2 platform to compare and analyze its performance against the DES and RSA algorithms, based on the following three criteria: (a) Bit rate; (b) Packet delay; and (c) The number of packets. Thus the motivation is to provide secure data transfer in the wireless medium in IEEE 802.11i.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.283.290 2014/08/27 - 19:27
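Since the comparison above centers on AES in CCM mode (CTR for confidentiality plus CBC-MAC for authentication), a minimal sketch of authenticated encryption with AES-CCM using the Python `cryptography` package is given below; the key, nonce and payload are illustrative only and have nothing to do with 802.11i frame formats.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

key = AESCCM.generate_key(bit_length=128)   # 128-bit AES key
aesccm = AESCCM(key)

nonce = os.urandom(13)                      # CCM nonce (7 to 13 bytes)
plaintext = b"sample payload"
associated_data = b"header-not-encrypted"   # authenticated but not encrypted

ciphertext = aesccm.encrypt(nonce, plaintext, associated_data)
recovered = aesccm.decrypt(nonce, ciphertext, associated_data)
assert recovered == plaintext
print("ciphertext length:", len(ciphertext))   # plaintext + 16-byte authentication tag
```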

In this study, we focus on two objectives: (1) To raise awareness of the computing field in three groups of students: high school students, freshmen in their first term of university (i.e., students taking the Introduction to IT course) and freshmen in their second term of university (i.e., students taking the Programming I course); and (2) to organize visible CS/IT activities (i.e., robots, the Google Developer Group (GDG), Hour of Code, etc.) and involve as many students as possible. We conducted a detailed survey among these three groups of students to measure the effectiveness of making presentations on CS/IT and to determine whether or not awareness of computer science increased as a result. As well, we organized a series of CS/IT activities and events. We measured the effectiveness of the Hour of Code activity on 515 students from grades 1 to 5. The survey results were promising and we conclude that such efforts should continue to be a topic of research into student enrollment growth in future academic terms. The study is also framed to benefit faculty members, administrators and others throughout the global community.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.145.152 2014/08/25 - 18:15

Accurate detection of the End-Diastolic (ED) and End-Systolic (ES) frames of a cardiac cycle is a significant factor that may affect the accuracy of the abnormality assessment of a ventricle. This process is a routine step of the ventricle assessment procedure, as in clinical reports many parameters are measured in these two frames to help in diagnosis and decision making. According to previous works, detecting the ED and ES frames remains a challenge, in that the ED and ES frames for the cavity are usually determined manually by reviewing individual image phases of the cavity and/or tracking the tricuspid valve. The proposed algorithm automatically determines the ED and ES frames from four-dimensional echocardiographic (4DE) images of the Right Ventricle (RV) over one cardiac cycle, by computing the area of three slices along the cycle and selecting the frame with the maximum area as the ED frame and the frame with the minimum area as the ES frame. This method gives an accurate determination of the ED and ES frames and hence avoids the need for time-consuming expert contributions during the process of computing the cavity stroke volume.

http://www.thescipub.com/abstract/10.3844/jcssp.2015.230.240 2014/08/25 - 18:15
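The selection rule above (largest cavity area marks end-diastole, smallest marks end-systole) reduces to an argmax/argmin over per-frame areas; a hedged sketch with synthetic per-slice areas is shown below, where the three-slice layout follows the abstract but the area values are invented.

```python
import numpy as np

def detect_ed_es(slice_areas: np.ndarray) -> tuple[int, int]:
    """slice_areas: (n_frames, n_slices) cavity areas over one cardiac cycle.
    Returns (ed_frame, es_frame) as the frames with largest/smallest total area."""
    total = slice_areas.sum(axis=1)          # combine the three measured slices
    return int(np.argmax(total)), int(np.argmin(total))

# Synthetic cycle: areas fall from diastole to systole and recover (20 frames, 3 slices).
frames = 20
phase = np.linspace(0, 2 * np.pi, frames)
base = 10 + 4 * np.cos(phase)                # largest at frame 0, smallest mid-cycle
areas = np.stack([base * s for s in (1.0, 0.9, 0.8)], axis=1)

ed, es = detect_ed_es(areas)
print(f"ED frame = {ed}, ES frame = {es}")   # ED near frame 0, ES near mid-cycle
```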