Journal of Software Engineering

Coverage-based fault localization is a statistical technique that helps developers find faulty entities efficiently by contrasting program traces. Although coverage-based fault localization has been shown to be promising, its effectiveness still suffers from occurrences of coincidental correctness, in which test cases exercise faulty statements but do not produce failures. Recent research indicates that coincidental correctness is a common problem in software testing and is harmful to fault localization. To enhance the effectiveness of fault localization, this study presents a clustering approach to identify coincidental correctness in test suites. An effective clustering technique is used to group test cases. We then present an adaptive sampling strategy to identify and choose potential coincidentally correct tests from the clusters, such that the number of identified coincidentally correct tests is guaranteed to be no more than the actual number of coincidentally correct tests in the test suite. Three representative fault localization techniques are evaluated to see whether they can benefit from the identified coincidentally correct tests. The experimental results show that our approach can alleviate the coincidental correctness problem and improve the effectiveness of fault localization.

http://scialert.net/abstract/?doi=jse.2014.328.344 2014/08/07 - 16:13
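
A rough sketch of the pipeline the abstract outlines, under two stated assumptions not taken from the paper: k-means as the clustering technique and Ochiai as the representative suspiciousness formula. Passing tests that share a cluster with failing tests are flagged as potentially coincidentally correct, capped so the flagged count cannot exceed an assumed CC count, and suspiciousness is recomputed after relabeling them. All names and the toy data are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

def ochiai(cov, failed):
    """cov: (tests x stmts) 0/1 matrix; failed: boolean per test."""
    nf = cov[failed].sum(axis=0)          # failing tests covering each stmt
    np_ = cov[~failed].sum(axis=0)        # passing tests covering each stmt
    denom = np.sqrt(failed.sum() * (nf + np_))
    return np.divide(nf, denom, out=np.zeros(cov.shape[1]), where=denom > 0)

def suspect_cc_tests(cov, failed, k=4, cap=None):
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(cov)
    suspects = []
    for c in range(k):
        members = np.where(labels == c)[0]
        if failed[members].any():         # cluster mixes passes with failures
            suspects += [t for t in members if not failed[t]]
    if cap is not None:                   # never exceed the assumed CC count
        suspects = suspects[:cap]
    return suspects

# toy data: 6 tests x 5 statements
cov = np.array([[1,1,1,0,0],[1,0,1,1,0],[0,1,1,0,1],
                [1,1,0,0,1],[0,0,1,1,1],[1,0,1,0,1]])
failed = np.array([True, False, False, False, True, False])
cc = suspect_cc_tests(cov, failed, k=2, cap=2)
failed_adj = failed.copy(); failed_adj[cc] = True  # relabel suspected CC tests
print(ochiai(cov, failed_adj))
```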

The rapidly increasing fuel price and the depletion of fossil fuel resources in Malaysia are alarming; renewable sources of energy should therefore be studied further. There have been attempts to use biomass as a renewable energy resource to produce producer gas, which can be used as fuel; however, shortcomings and inefficiencies in the available technology have made it uneconomical. The incombustible CO2 in the producer gas from biomass gasification must be removed to enhance its quality. For that purpose, a device called the Bubbling Fluidized Bed CO2 Absorption Reactor (CO2 BFBAR) was developed. Calcium oxide (CaO) derived from natural limestone produced in Malaysia, a country with abundant limestone reserves, can be an effective sorbent for absorbing carbon dioxide (CO2) gas at high temperature. In the present study, the hydrodynamic characteristics of the CO2 BFBAR are simulated. The objective is to find out whether the 50% CaO-sand mixture used in the reactor can be fluidized properly without sticking. The three-dimensional geometry and mesh of the CO2 BFBAR were built with ANSYS CFD software and then exported to the Computational Fluid Dynamics (CFD) solver Fluent v6.2.16 for analysis. Applying constant volume flow rates Q from 15-55 L min-1, Fluent simulated the bubbling height of the CaO-sand material and the bubbling behavior in the CO2 BFBAR versus time for three CaO particle sizes (100, 500 and 1000 μm). The simulation results were then compared with cold-model experiment results. Overall, the simulation results agree well with the experiments: the percentage differences obtained for the three CaO particle sizes are below 5.3%. All results obtained have been used to design the CO2 BFBAR.

http://scialert.net/abstract/?doi=jse.2014.345.360 2014/08/07 - 16:13

Predicting the mortality of ICU patients with high accuracy has been an active research topic in the clinical domain for decades. However, the special features of ICU data, such as high dimensionality, uncertain sampling and imbalanced distribution, make prediction challenging. In this study, a hierarchical data model is proposed to describe these special features. A hybrid framework combining clustering and machine learning algorithms converts ICU time series with these properties into traditional time series data so that time series data mining can be applied. The high dimensionality of ICU data is reduced by time series clustering, while uncertain sampling is handled by a certainty strategy. Additionally, unknown medical knowledge exists both within variables and among variables and is difficult to extract. To address this issue, the proposed framework uses clustering to extract knowledge within each variable, while knowledge among variables is extracted by classical machine learning algorithms. Experimental results show that the prediction accuracy of the proposed hybrid framework is better than that of data mining methods which do not consider the properties of ICU data; moreover, more efficient results can be achieved with a suitable choice of sampling frequency.

http://scialert.net/abstract/?doi=jse.2014.361.374 2014/08/07 - 16:13
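
The sketch below is one plausible reading of such a hybrid pipeline, not the paper's exact algorithms: each variable's irregularly sampled series is interpolated onto a fixed grid, k-means reduces each variable to a cluster label (knowledge within a variable) and a standard classifier learns across variables. The variable names and data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def resample(times, values, grid):
    """Map an irregularly sampled series onto a fixed time grid."""
    return np.interp(grid, times, values)

def featurize(patients, grid, k=3):
    """patients: list of dicts {var_name: (times, values)}."""
    var_names = sorted(patients[0])
    feats = np.zeros((len(patients), len(var_names)))
    for j, var in enumerate(var_names):
        series = np.vstack([resample(*p[var], grid) for p in patients])
        # cluster label as a low-dimensional summary of each variable
        feats[:, j] = KMeans(n_clusters=k, n_init=10).fit_predict(series)
    return feats

grid = np.linspace(0, 48, 24)                 # 48 h ICU stay, 2 h bins
rng = np.random.default_rng(0)
patients = [{"hr": (np.sort(rng.uniform(0, 48, 30)), rng.normal(80, 10, 30)),
             "map": (np.sort(rng.uniform(0, 48, 20)), rng.normal(70, 8, 20))}
            for _ in range(40)]
y = rng.integers(0, 2, 40)                    # mortality labels (toy)
X = featurize(patients, grid)
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.predict(X[:5]))
```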

In this study, a production control policy is developed for a single-machine, multi-product, unreliable manufacturing system with defective items, with the aim of minimizing the average production cost. Compared with the theoretical value, the policy proves feasible. It is proved optimal when the system satisfies a special condition; complete analytical solutions for the hedging points and the average inventory/backlog cost are obtained and the relationship between the hedging points and the system parameters is given. In the solution part, a computer simulation method combined with a particle swarm algorithm is proposed to obtain approximate values of the hedging points. The influence of initial states on the hedging points is discussed. Simulation results demonstrate that the method can also be applied to systems in the general case.

http://scialert.net/abstract/?doi=jse.2014.375.386 2014/08/07 - 16:13

The econometric error correction model was used to study the impact and contribution of science and technology investment on local economic growth in Fuyang city in northern Anhui Province, China. The short-term fluctuations and the long-run equilibrium relationship between economic growth and science and technology investment in Fuyang city were also analyzed. A statistically significant correlation was found between the amount of science and technology investment and local economic growth, with promoting elasticity coefficients of 0.171785 in the long term and 0.340016 in the short term. According to these elasticity coefficients, regional GDP growth in Fuyang city is estimated to rise by 0.171785% in the long term and 0.340016% in the short term for every 1% increase in science and technology investment. Moreover, the index of the potential promoting efficiency of science and technology investment on regional GDP was remarkable, with a Q statistic of 9.497250794. The present study indicates that the potential effect and contribution of science and technology investment to local economic growth in Fuyang city is substantial.

http://scialert.net/abstract/?doi=jse.2014.387.398 2014/08/07 - 16:13
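
For readers unfamiliar with the technique, the sketch below shows the standard two-step Engle-Granger form of an error correction model: a long-run (cointegrating) regression in levels, whose slope is the long-run elasticity in a log-log specification, then a regression in differences with the lagged residual, whose coefficients give the short-run elasticity and the adjustment speed. The series are synthetic stand-ins, not the Fuyang data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40
x = np.cumsum(rng.normal(0.05, 0.02, n)) + 1.0   # log S&T investment (toy)
y = 0.17 * x + 0.5 + rng.normal(0, 0.01, n)      # log regional GDP (toy)

# Step 1: long-run (cointegrating) regression; slope = long-run elasticity
long_run = sm.OLS(y, sm.add_constant(x)).fit()
ect = long_run.resid                              # error correction term

# Step 2: short-run dynamics with the lagged ECT
dy, dx = np.diff(y), np.diff(x)
X = sm.add_constant(np.column_stack([dx, ect[:-1]]))
ecm = sm.OLS(dy, X).fit()
print("long-run elasticity:", long_run.params[1])
print("short-run elasticity:", ecm.params[1], "adjustment:", ecm.params[2])
```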

In this study, an image segmentation method with automatically selected thresholds, based on an improved genetic algorithm, is presented. It overcomes the shortcomings of existing image segmentation methods, which consider only pixel gray values while ignoring spatial features and which incur excessive computational complexity. The encoding, crossover and mutation operators and other parameters of the genetic algorithm are moderately improved in this method, and optimal threshold selection is converted into an optimization problem. The selection operator is optimized by using a simulated-annealing temperature parameter to control selection pressure. The optimal threshold is then found by exploiting the optimizing efficiency of the improved genetic algorithm. Simulation results show that the new algorithm greatly reduces optimization time, enhances the anti-noise performance of image segmentation and improves segmentation efficiency. Thus, the new method can facilitate subsequent processing for computer vision and can be applied to real-time image segmentation.

http://scialert.net/abstract/?doi=jse.2014.399.408 2014/08/07 - 16:13
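
A minimal sketch of the idea, under two assumptions not spelled out in the abstract: the fitness is Otsu's between-class variance and the annealing temperature enters through Boltzmann selection. The histogram and all parameters are illustrative.

```python
import numpy as np

def between_class_variance(hist, t):
    p = hist / hist.sum()
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = (np.arange(t) * p[:t]).sum() / w0
    mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

def ga_threshold(hist, pop_size=20, gens=50, t0=1.0, cooling=0.95):
    rng = np.random.default_rng(0)
    pop = rng.integers(1, 256, pop_size)
    temp = t0
    for _ in range(gens):
        fit = np.array([between_class_variance(hist, t) for t in pop])
        # Boltzmann selection: lower temperature -> stronger pressure
        w = np.exp((fit - fit.max()) / max(temp, 1e-6))
        parents = rng.choice(pop, pop_size, p=w / w.sum())
        pairs = parents.reshape(2, -1)
        cross = (pairs[0] + pairs[1]) // 2            # arithmetic crossover
        children = np.concatenate([cross, cross]) + rng.integers(-8, 9, pop_size)
        pop = np.clip(children, 1, 255)               # mutation + bounds
        temp *= cooling
    fit = np.array([between_class_variance(hist, t) for t in pop])
    return int(pop[np.argmax(fit)])

rng = np.random.default_rng(2)
hist = np.concatenate([rng.poisson(50, 128),          # bimodal toy histogram
                       rng.poisson(200, 128)]).astype(float)
print("selected threshold:", ga_threshold(hist))
```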

Accurate performance evaluation of Information Technology (IT) applications provides direction for enterprise information construction. Whether the emerging Internet of Things (IOT) can land effectively in industry remains highly uncertain; therefore, evaluating and analyzing the specific benefits of IOT technology for an enterprise provides important guidance for its application. This study proposes a new evaluation method for enterprise IT applications that integrates the Performance Reference Model of the Federal Enterprise Architecture Framework with the analytic hierarchy process. The proposed methodology is applied to the evaluation of IOT applications for an oil and gas pipeline enterprise. The results show that the proposed method can help the oil and gas pipeline enterprise make predictive analyses of IOT applications, providing a scientific basis for enterprise IOT adoption.

http://scialert.net/abstract/?doi=jse.2014.409.418 2014/08/07 - 16:13
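
The analytic-hierarchy-process step can be illustrated compactly: criterion weights are the principal eigenvector of a pairwise comparison matrix, checked with Saaty's consistency ratio. The comparison values below are hypothetical, not taken from the study.

```python
import numpy as np

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                              # normalized priority vector
    lam, n = vals[i].real, A.shape[0]
    ci = (lam - n) / (n - 1)                  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
    return w, ci / ri                         # weights, consistency ratio

# pairwise comparisons of three benefit criteria (hypothetical values)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print("weights:", w.round(3), "CR:", round(cr, 3))   # CR < 0.1 is acceptable
```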

Alarm association rule mining is an important task in system fault diagnosis and localization. Once a system fails, it produces a large amount of alarm information. By analyzing the characteristics of booking system alarm data, this study puts forward an alarm association rule mining algorithm based on a sliding time window model to find the fault source and the correlations between fault factors in large volumes of alarm information. Experiments show that valuable alarm association rules can be acquired from the alarm data accurately and rapidly. These rules can provide decision support for system maintenance personnel.

http://scialert.net/abstract/?doi=jse.2014.419.427 2014/08/07 - 16:13
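
A toy sketch of sliding-window rule mining (the window size, thresholds and alarm types are invented for illustration): alarm co-occurrences inside each time window are counted and a rule is kept when its support and confidence clear the thresholds.

```python
from collections import Counter
from itertools import combinations

def mine_rules(alarms, window, min_support=2, min_conf=0.6):
    """alarms: list of (timestamp, alarm_type), sorted by timestamp."""
    single, pair = Counter(), Counter()
    start = 0
    for end in range(len(alarms)):
        while alarms[end][0] - alarms[start][0] > window:
            start += 1                        # slide the window forward
        types = {a for _, a in alarms[start:end + 1]}
        single.update(types)
        pair.update(frozenset(p) for p in combinations(sorted(types), 2))
    rules = []
    for p, sup in pair.items():
        if sup < min_support:
            continue
        a, b = tuple(p)
        for x, y in ((a, b), (b, a)):        # test both rule directions
            conf = sup / single[x]
            if conf >= min_conf:
                rules.append((x, y, sup, conf))
    return rules

alarms = [(0, "LINK_DOWN"), (2, "DB_TIMEOUT"), (3, "LINK_DOWN"),
          (4, "DB_TIMEOUT"), (10, "CPU_HIGH"), (11, "DB_TIMEOUT")]
for x, y, sup, conf in mine_rules(alarms, window=3):
    print(f"{x} => {y}  support={sup} confidence={conf:.2f}")
```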

The pricing and supply of residential water, especially in water-deficient inland areas, are crucial and urgent for urban living. Fuyang city is a typical inland Chinese metropolis short of water resources. In order to evaluate the determinants of and factors impacting water pricing, an empirical study was carried out using a multivariate regression model to analyze the correlations among residential water price, urban disposable income and water consumption. The current urban water price was found to be negatively and significantly correlated with urban per capita disposable income and average water consumption. Confronted with the increasingly scarce urban water resources and the difficulties encountered in pricing residential water, countermeasures and suggestions were put forward for allocating the urban water resources of Fuyang city reasonably.

http://scialert.net/abstract/?doi=jse.2014.287.303 2014/06/24 - 14:50

OpenFlow compensates for shortcomings of traditional networks, but its network session identification is inefficient and its packet forwarding path selection is poor. Focusing on forwarding paths and matching, we propose the OGMD model, which combines GPUs with biological sequence algorithms and machine learning methods to accelerate matching and improve the network environment; we also propose a matching algorithm and a path selection algorithm. Experiments show that the matching algorithm achieves a speedup of 325 and that the path selection algorithm keeps the link loss rate below 5% (an average decline of 69.35%) and the network delay below 20 msec (an average decline of 63.28%).

http://scialert.net/abstract/?doi=jse.2014.304.313 2014/06/24 - 14:50

A Web service process is well-structured if its branch flows are split and converged correctly in the logical structure. Well-structuredness is an important property for guaranteeing that a service process works normally. To detect the well-structuredness of Web service processes described in WS-BPEL, a logical Petri net method is proposed. The WS-BPEL service process is modeled as a service net formed from logical Petri nets and the well-structuredness property is mapped onto features of the logic expressions in the logical transitions of the service net. A formal definition of well-structuredness and a decision method are proposed and examples are provided to show how to detect whether a service net is well-structured. With the proposed method, the well-structuredness of WS-BPEL processes can be easily analyzed.

http://scialert.net/abstract/?doi=jse.2014.314.320 2014/06/24 - 14:50

In this study, a new method is proposed to replace periodic boundary conditions: the inlet and outlet boundaries are treated with virtual following particles in Smoothed Particle Hydrodynamics (SPH). The virtual following particles are located outside the inlet or outlet boundaries and their velocities and positions are updated according to the corresponding interior particles. The method is based on an analysis of the characteristics of laminar flow and can be applied to low-Reynolds-number flows that are laminar at the inlet and/or outlet boundaries. Poiseuille flow and two-dimensional flow between two inclined plates are simulated with the proposed method and the results are in good agreement with theoretical values.

http://scialert.net/abstract/?doi=jse.2014.321.327 2014/06/24 - 14:50
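
A conceptual sketch of the boundary treatment; the geometry, offsets and update rule here are simplified guesses at the scheme, for illustration only: ghost particles outside the inlet and outlet copy the velocities of their corresponding interior particles and follow them at a fixed offset.

```python
import numpy as np

L = 1.0              # channel length; inlet at x=0, outlet at x=L
BUFFER = 0.1         # depth of the virtual-particle band

def update_virtual_particles(pos, vel):
    """pos, vel: (n, 2) arrays for interior particles. Returns virtual
    following particles placed outside the inlet and outlet."""
    near_in = pos[:, 0] < BUFFER          # interior particles near inlet
    near_out = pos[:, 0] > L - BUFFER     # ... and near outlet
    ghosts_in = pos[near_in].copy()
    ghosts_in[:, 0] -= BUFFER             # place upstream of the inlet
    ghosts_out = pos[near_out].copy()
    ghosts_out[:, 0] += BUFFER            # place downstream of the outlet
    gpos = np.vstack([ghosts_in, ghosts_out])
    gvel = np.vstack([vel[near_in], vel[near_out]])  # copy velocities
    return gpos, gvel

rng = np.random.default_rng(0)
pos = rng.uniform([0, 0], [L, 0.2], (50, 2))
vel = np.column_stack([np.full(50, 0.01), np.zeros(50)])  # laminar inflow
gpos, gvel = update_virtual_particles(pos, vel)
print(len(gpos), "virtual particles generated")
```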

Given the constrained energy, low storage space and limited computing ability of wireless sensor network nodes, an intrusion detection model based on GA-LMBP is proposed. Compared with traditional methods, the scheme exploits an offline-learning neural network algorithm to build the detection model without storing large amounts of intrusion features, saving storage resources. Compared with capturing data in promiscuous mode, the multi-detection cooperative mechanism reduces energy consumption. Simulation results show that the GA-LMBP intrusion detection model is better than traditional methods in terms of performance, energy consumption, storage cost, detection rate and false detection rate.

http://scialert.net/abstract/?doi=jse.2014.225.238 2014/06/21 - 11:19

An efficient algorithm for shot boundary detection is proposed in this study. In order to eliminate disturbances caused by illumination and camera motion, the Dual Tree Complex Wavelet Transform (DTCWT) is used in raw feature generation. Enhanced texture features are then extracted from the six detail subbands by calculating the mean and standard deviation of the high-frequency coefficients. Non-negative Matrix Factorization (NMF) is used to extract structure features from the low-frequency sub-band image and a dynamic threshold is used for shot boundary detection. Experimental results on the TRECVID 2001 test data and other test videos show that the proposed scheme not only overcomes illumination and motion effects efficiently, but also achieves a high detection speed.

http://scialert.net/abstract/?doi=jse.2014.239.251 2014/06/21 - 11:19
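
The dynamic-threshold step can be sketched independently of the DTCWT/NMF features: declare a cut wherever the inter-frame feature distance exceeds a threshold adapted from a local window. The rule below (mean plus a multiple of the standard deviation) is an assumption; the abstract does not give the exact formula.

```python
import numpy as np

def detect_boundaries(features, win=10, alpha=3.0):
    """features: (n_frames, d) array of per-frame feature vectors."""
    d = np.linalg.norm(np.diff(features, axis=0), axis=1)  # frame distance
    boundaries = []
    for i in range(len(d)):
        local = d[max(0, i - win):i]
        if len(local) < 3:
            continue
        thresh = local.mean() + alpha * local.std()        # dynamic threshold
        if d[i] > thresh:
            boundaries.append(i + 1)       # cut between frame i and i+1
    return boundaries

rng = np.random.default_rng(4)
shots = [rng.normal(m, 0.05, (30, 8)) for m in (0.0, 1.0, 2.0)]  # 3 toy shots
video = np.vstack(shots)
print("detected cuts near frames:", detect_boundaries(video))
```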

This study attempts to suggest a way to translate strategic objectives into operational business goals. More precisely, the main objective is to discuss the mappings that might be established between an intention-oriented language and a business process modeling language in order to align the intentional level with the operational level. In our view, such an alignment can help software designers easily transform business requirements into business process descriptions. The idea is to propose a mapping approach bridging the gap between a requirements model and a business model. The approach is illustrated using MAP, an intention-oriented language mainly intended to describe business requirements at the intentional level, whose first applications were in Information System Engineering for modeling processes in a flexible way, and the Business Process Modeling Notation (BPMN), a graph-oriented modeling notation targeted at modeling business goals at the operational level. Based on these mappings, an example is presented that illustrates the translation from MAP process model elements to BPMN.

http://scialert.net/abstract/?doi=jse.2014.252.264 2014/06/21 - 11:19

Very few research studies have discussed employing data mining techniques in the field of IS success/effectiveness assessment. For this reason, the purpose of this study is to employ data mining techniques, particularly classification, in the evaluation of Information System (IS) effectiveness. This important issue helps and supports decision makers and IT managers in developing information system quality consistent with user needs and expectations. A data set of 255 subjects was collected using a questionnaire covering six dimensions: five quality factors (system quality, information quality, service quality, user interface quality and communication quality) and user satisfaction. This study employs data mining techniques to develop a model supporting the prediction of user satisfaction with IS inside international organizations. To validate the generated model, several experiments were performed on real data collected from international organization employees. The encouraging experimental results show that the model soundly predicts the level of user satisfaction with the employed IS. The results also indicate that the tree classification algorithm J48 performs best for a supervised target with two values. Notably, the results are consistent with the regression analysis and could contribute to related empirical studies.

http://scialert.net/abstract/?doi=jse.2014.265.277 2014/06/21 - 11:19
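
As a hedged illustration of the classification step: J48 is Weka's C4.5 implementation; the sketch below substitutes scikit-learn's DecisionTreeClassifier (CART) and simulates Likert-scale quality scores, since the study's 255-subject data set is not available here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
n = 255
# columns: system, information, service, interface, communication quality
X = rng.integers(1, 6, (n, 5)).astype(float)      # 5-point Likert scores
# toy rule: users are satisfied when average perceived quality is high
y = (X.mean(axis=1) + rng.normal(0, 0.4, n) > 3.2).astype(int)

clf = DecisionTreeClassifier(max_depth=4, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)        # 10-fold cross-validation
print("mean CV accuracy: %.3f" % scores.mean())
```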

The present study proposes a dynamic replica management strategy based on the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) for cloud storage systems. The replica management strategy includes a replica placement algorithm and a replica number strategy. The replica placement algorithm sorts the nodes by TOPSIS according to node performance and load, and replicas are placed on nodes with high performance and low load. The replica number strategy designs a reliability model and an availability model to compute the number of replicas: the initial number of replicas is decided according to the reliability model and the number of replicas is adjusted dynamically according to the availability model. Experimental results show that the proposed strategy outperforms the Default Replica Management Strategy in Hadoop (DRMSH) distributed file system in response delay and load balance.

http://scialert.net/abstract/?doi=jse.2014.278.286 2014/06/21 - 11:19
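
TOPSIS itself is compact enough to show in full. The sketch below ranks hypothetical storage nodes: benefit criteria (performance-like metrics) are maximized and cost criteria (load) minimized; the criteria, weights and node values are invented for illustration.

```python
import numpy as np

def topsis(M, weights, benefit):
    """M: (nodes x criteria); benefit: True for benefit criteria."""
    R = M / np.linalg.norm(M, axis=0)              # vector normalization
    V = R * weights                                # weighted matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - worst, axis=1)
    return d_neg / (d_pos + d_neg)                 # closeness coefficient

# columns: CPU score, bandwidth (MB/s), current load (%)
nodes = np.array([[0.9, 120.0, 35.0],
                  [0.6,  80.0, 20.0],
                  [0.8, 100.0, 70.0]])
score = topsis(nodes, weights=np.array([0.4, 0.3, 0.3]),
               benefit=np.array([True, True, False]))
print("replica placement order:", np.argsort(score)[::-1])
```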

This study proposes a Particle Swarm Optimization-based Augmented Lagrangian (PSOAL) algorithm, which combines the particle swarm optimization technique with a non-stationary penalty function method to solve constrained optimization and engineering design problems. A set of novel strategies is developed, based on particle feasibility, to adaptively update critical parameters and a point-based local search procedure is embedded within the algorithm framework to improve its convergence. Thirteen well-known constrained benchmark problems are solved and the obtained results are compared with other state-of-the-art algorithms. The results demonstrate that the proposed PSOAL achieves higher accuracy than the other algorithms considered. In addition, as an added benefit, PSOAL can easily recover the Lagrange multipliers, which are of great value for sensitivity analysis in practice but are rarely provided by intelligent algorithms designed for constrained problems.

http://scialert.net/abstract/?doi=jse.2014.169.183 2014/05/08 - 18:01
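
To make the multiplier-recovery point concrete, here is a stripped-down sketch of the augmented-Lagrangian-plus-PSO idea on a toy problem (minimize x1^2 + x2^2 subject to x1 + x2 >= 1, whose optimum is (0.5, 0.5) with multiplier 1). PSOAL's adaptive parameter strategies and local search are omitted; this is not the paper's algorithm.

```python
import numpy as np

f = lambda x: (x ** 2).sum(axis=1)
g = lambda x: 1.0 - x.sum(axis=1)          # g(x) <= 0 is feasible

def pso(obj, n=30, iters=80, lo=-2.0, hi=2.0):
    rng = np.random.default_rng(6)
    x = rng.uniform(lo, hi, (n, 2)); v = np.zeros((n, 2))
    pbest, pval = x.copy(), obj(x)
    for _ in range(iters):
        gbest = pbest[np.argmin(pval)]
        r1, r2 = rng.random((2, n, 2))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = obj(x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
    return pbest[np.argmin(pval)]

lam, rho = 0.0, 10.0
for _ in range(8):                          # outer augmented-Lagrangian loop
    aug = lambda x: f(x) + rho / 2 * np.maximum(0, g(x) + lam / rho) ** 2
    xk = pso(aug)                           # PSO minimizes the inner problem
    lam = max(0.0, lam + rho * g(xk[None])[0])   # multiplier update
print("x* ~", xk.round(3), "lambda ~", round(lam, 3))
```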

Authentication is a very significant requirement in wireless sensor networks, especially in critical applications. However, most previous user authentication schemes are vulnerable or consume too much power for resource-constrained WSN nodes. This study focuses on user authentication, investigating the schemes of Park and others to identify their demerits. A novel, lightweight mutual user authentication protocol named MUAP is then proposed. The analysis and results show that the proposed scheme not only resists specific attacks such as the man-in-the-middle attack, impersonation attack and message-alteration attack, but also outperforms the protocols of Kumar and others in terms of device computation overhead and communication consumption.

http://scialert.net/abstract/?doi=jse.2014.184.193 2014/05/08 - 18:01

In this study, we propose a mutual authentication protocol between sensor nodes and gateway nodes. Our scheme adopts multiple security techniques against different security threats, such as time stamps against replay attacks and the ZUC encryption algorithm against data eavesdropping and unauthorized falsification. To verify the effectiveness and efficiency of our scheme, we implemented a TelosB-based wireless sensor testbed in which all sensor nodes were equipped with our security scheme to facilitate mutual authentication between each other. The experimental results reveal that our proposal not only resists common attacks, such as impersonation, replay and DoS attacks, but also shows sound network performance.

http://scialert.net/abstract/?doi=jse.2014.194.202 2014/05/08 - 18:01

This study describes Dynamo, a highly available key-value store. The system is part of Amazon's core business, providing "always-on" services. Dynamo's main advantage is that it is fully distributed with no central node, which improves system performance. It exposes three parameters (N, R, W) whose values can be set according to users' needs. It uses a full membership model, in which each node knows which data its peers hold. To provide this service, Dynamo uses multiple versions of data, together with application-supported conflict resolution, to achieve eventual data consistency.

http://scialert.net/abstract/?doi=jse.2014.203.210 2014/05/08 - 18:01
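
The (N, R, W) parameters trade latency against consistency; a quick sketch of the standard quorum condition behind them: with R + W > N every read quorum overlaps every write quorum, so a read sees at least one up-to-date replica (divergent versions are then reconciled by the application).

```python
N = 3                                      # replicas per key

def quorum_overlaps(n, r, w):
    """True when read and write quorums are guaranteed to intersect."""
    return r + w > n

for r, w in [(1, 1), (2, 2), (1, 3)]:
    verdict = "consistent reads" if quorum_overlaps(N, r, w) else "may read stale data"
    print(f"N={N} R={r} W={w}: {verdict}")
```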

Face feature extraction is a key technology for face recognition. A new feature extraction framework is proposed in this study: wavelet and Gabor transforms are used with the maximum margin criterion to extract face features from a single training sample. The smaller volume of face feature data makes it possible to transmit the data to servers as quickly as possible. Results on the ORL database show that the proposed method improves performance while producing less data to transmit.

http://scialert.net/abstract/?doi=jse.2014.211.218 2014/05/08 - 18:01

In this study, an echo signal simulation platform for radio detectors is constructed to obtain echo data of possible targets, which can be used as test data in detector design and testing. Based on MATLAB Simulink, the echo evaluation module is programmed in C++ and a 3ds Max model serves as the scene model, updated according to the new positions of the possible targets, the surroundings and the radio detector itself. Furthermore, a VRML display ActiveX control is used as the dynamic display block. A space scene model set up with 3ds Max is used to verify that the simulated echo signals have characteristics consistent with test data received in the field.

http://scialert.net/abstract/?doi=jse.2014.219.224 2014/05/08 - 18:01

With the growth of the internet, web applications have become very popular and are used in every environment, including medical, financial and military settings. But in the race to develop these online services, web applications have been developed and deployed with minimal attention to security risks, which leads to vulnerable web applications. Developers are mandated to deliver functionality on time and on budget, but not to develop secure web applications. Removing vulnerabilities after development wastes cost as well as time; implementing security throughout the software development lifecycle saves both. Thousands of vulnerabilities exist in web applications, but this study focuses on input validation vulnerabilities, i.e., SQL injection and Cross Site Scripting (XSS), as they are the most prevalent and carry high risk. A brief introduction to web application vulnerabilities is given, how cross-site scripting and SQL injection vulnerabilities can be addressed throughout the software development lifecycle is discussed and activities to mitigate them are suggested.

http://scialert.net/abstract/?doi=jse.2014.116.126 2014/04/14 - 09:14
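
The two mitigations at the heart of the study reduce to a few lines each; the sketch below shows parameterized queries against SQL injection and output escaping against XSS, with an illustrative in-memory table and hostile inputs.

```python
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, bio TEXT)")

user_name = "alice'; DROP TABLE users; --"        # hostile input
user_bio = "<script>alert('xss')</script>"

# SQL injection defense: placeholders keep input out of the SQL grammar
conn.execute("INSERT INTO users (name, bio) VALUES (?, ?)",
             (user_name, user_bio))

# XSS defense: escape untrusted data before rendering it as HTML
row = conn.execute("SELECT name, bio FROM users").fetchone()
print(f"<p>{html.escape(row[0])}: {html.escape(row[1])}</p>")
```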

This study begins with a review of the research status of search engines, followed by a discussion of search engine goals, and then explains the principle of distributed computing. The MapReduce distributed computing model and the Hadoop Distributed File System (HDFS) are then analyzed in detail. Finally, a distributed search engine architecture is presented and, on the basis of this architecture, future challenges and opportunities for distributed search engines are highlighted.

http://scialert.net/abstract/?doi=jse.2014.127.131 2014/04/14 - 09:14
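
The MapReduce model the study analyzes can be conveyed with a tiny in-process example that builds an inverted index, the core data structure of a search engine; in a real deployment the same map and reduce functions would run on Hadoop over HDFS.

```python
from collections import defaultdict

def map_phase(doc_id, text):
    for word in text.lower().split():
        yield word, doc_id                 # emit (term, document) pairs

def reduce_phase(word, doc_ids):
    return word, sorted(set(doc_ids))      # merge postings per term

docs = {1: "distributed search engine", 2: "distributed file system",
        3: "search engine architecture"}

shuffled = defaultdict(list)               # the shuffle groups by key
for doc_id, text in docs.items():
    for word, d in map_phase(doc_id, text):
        shuffled[word].append(d)

index = dict(reduce_phase(w, ids) for w, ids in shuffled.items())
print(index["search"])                     # -> [1, 3]
```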

The overall goal of our approach is to relate models of a given domain that are created by different actors and are thus generally heterogeneous, that is, described in different DSLs (Domain Specific Languages). Instead of building a single global model, we propose to organize the different source models as a network of models which provides a global view of the system through a virtual global model. The matching of these models is done in a shared model of correspondences. In this study we focus on the elaboration of the model of correspondences through a transformation called "refine". The approach is illustrated by a representative use case (a bug tracking system) and supported by a modeling tool called HMS (Heterogeneous Matching Suite).

http://scialert.net/abstract/?doi=jse.2014.132.151 2014/04/14 - 09:14

Trustworthy Service Flow (TSF) is one of the most representative paradigms for trustworthy software. Because TSF involves non-functional attributes, such as quality of service and trustworthiness, its modelling and verification are very difficult, and classical model checking alone is not sufficient to meet TSF requirements. This paper proposes a method for the quantitative verification of TSF with stochastic model checking. First, based on an extension of the OWL-S upper ontology, formal semantics for TSF are given by Nondeterministic Probabilistic Petri Nets (NPPN) and the translation process is implemented automatically; simplification rules for NPPN are also presented. Then, a new PCTL-style temporal logic is proposed for specifying TSF properties and a stochastic model checking algorithm is put forward for their quantitative verification. The feasibility and effectiveness of the method are illustrated with a service flow case.

http://scialert.net/abstract/?doi=jse.2014.152.168 2014/04/14 - 09:14

Business Process Modeling Notation (BPMN) is the most influential graphical modeling notation for service composition and has been widely used in modeling Web service composition systems; however, BPMN lacks formal semantics and cannot be verified formally and automatically. The extended object Petri net (EOPN for short) is presented in order to model and verify BPMN processes formally. Guards, flow valves and time constraints are introduced into EOPN. Because the state class method for analyzing time Petri nets lacks global temporal constraints, a timestamp state class method is developed and the corresponding analysis approach is presented. The enabling conditions of transitions are listed and the firing conditions and rules of EOPN transitions are discussed in detail, as are the mapping rules from BPMN to EOPN. An example illustrates the feasibility of mapping a BPMN process diagram to an EOPN model and formally verifying its correctness.

http://scialert.net/abstract/?doi=jse.2014.58.74 2014/03/08 - 14:00

In Aspect-Oriented Software Development, aspects are not only used at the programming level but also tend to arise during requirements analysis and software architecture design. We previously proposed an approach named AspeCiS (An aspect-oriented Approach to Develop a Cooperative Information System) to develop a Cooperative Information System from existing Information Systems by using their artifacts, such as existing requirements and design elements. This approach includes an important step in which the aspectual requirements composition problem is considered to be one of the remaining challenges: when multiple aspectual requirements share the same join point, undesired behavior may emerge and a conflict resolution process must be triggered. This study presents a conflict resolution process among aspects at the requirements engineering level: a priority value is computed for each aspect, allowing a dominant aspectual requirement to be identified on the basis of stakeholder priority. This process is more formal than those currently proposed, which require a trade-off negotiation to resolve conflicts.
http://scialert.net/abstract/?doi=jse.2014.75.88 2014/03/08 - 14:00

The open programmable network element architecture based on Forwarding Element (FE) and Control Element (CE) Separation (ForCES) is an important trend for next generation network elements, and realizing the high availability and flexibility of the next generation network has become important research work. This study examines the high-availability requirements of the CE in detail, including rules and methods for ensuring backup data, and puts forward a heartbeat-based detection mechanism that helps discover CE faults. In brief, the contributions are to (a) analyze and improve the takeover methods between CEs, (b) realize the overall high-availability software architecture on the basis of the above techniques and (c) test the high availability of the ForCES CE; the effectiveness of the methods is finally validated by simulations.

http://scialert.net/abstract/?doi=jse.2014.89.99 2014/03/08 - 14:00

A service discovery algorithm based on service clusters is proposed in this study. Using ontologies and Petri nets, a formal definition of services is given and service clusters are formally represented as service cluster net units. The usefulness of the algorithm is demonstrated by an experiment.

http://scialert.net/abstract/?doi=jse.2014.100.107 2014/03/08 - 14:00

Web services are widely accepted and used in e-commerce. Trust plays an important role in selecting one Web service for an application among the many services satisfying a requester's demand, and trust in Web services is a hot topic in the research field. This study proposes a method to calculate the recommendation trust of Web services, in which the weight of every recommender is determined by the norm grey correlation analysis method. The detailed process of the method is illustrated with a specific instance. The method avoids malicious recommendations and improves the reliability of the selected services.

http://scialert.net/abstract/?doi=jse.2014.108.115 2014/03/08 - 14:00

Simulation software is an important tool for demonstrating the performance of a P-CDN. With the increasing complexity of P-CDNs, extendibility and distributed processing over multiple machines have become basic requirements for P-CDN simulation. In this study, a simulation software architecture with hierarchical extendable interfaces is put forward to meet diverse extension needs. Meanwhile, distributed processing over multiple machines is implemented, with a Markov chain model used to balance the load across machines. Experimental results indicate that the proposed software architecture achieves good performance even when the scale of peers grows large.

http://scialert.net/abstract/?doi=jse.2013.121.132 2013/09/06 - 23:06

To quickly locate an appropriate web service among the large number of services on the Internet, a clustering method is adopted to manage services. Services with similar functions are defined as a service cluster, and a service-cluster-oriented architecture and service binding algorithm are presented. However, the granularity of service clusters greatly impacts the efficiency of service discovery and aggregation. A three-dimensional granularity division method for service clusters is therefore proposed in this study, covering structure granularity, quantity granularity and quality granularity. Each kind of granularity is divided into three levels and the division rules for each level are presented. Simulation experiments illustrate the rationality of the proposed granularity division method.

http://scialert.net/abstract/?doi=jse.2013.133.141 2013/09/06 - 23:06

The explosive growth of mobile apps in recent years makes it much more difficult for users to find interesting apps. For this reason, online app markets, e.g., the Google Play market, have employed recommender systems. Such systems construct recommending networks of mobile apps that alleviate the challenge of app discovery. However, research efforts on recommender systems mainly focus on improving recommendation accuracy; little attention has been paid to measuring and optimizing the navigation effects of the recommending networks. Specifically, few works in the literature have focused on helping users explore more apps while discovering them with fewer jumps. This study therefore formulates this problem and proposes to reconstruct the recommending networks after they have been formed by the recommender systems. Since mobile apps in online markets constitute complex networks, this study designs reconstruction schemes leveraging complex network metrics and methods. In particular, based on specific complex network measurements, e.g., the number of SCCs (strongly connected components), the APL (average path length) and node centrality, this study proposes two reconstruction schemes. Real-data evaluations verify the effectiveness of the proposed schemes.

http://scialert.net/abstract/?doi=jse.2013.142.150 2013/09/06 - 23:06
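
The three measurements named in the abstract are all standard; a short sketch with networkx on a toy recommending graph (the apps and edges are invented):

```python
import networkx as nx

# directed "app A recommends app B" edges
G = nx.DiGraph([("app_a", "app_b"), ("app_b", "app_c"),
                ("app_c", "app_a"), ("app_c", "app_d")])

sccs = list(nx.strongly_connected_components(G))
print("number of SCCs:", len(sccs))

# APL is defined on (strongly) connected graphs; use the largest SCC here
giant = G.subgraph(max(sccs, key=len))
print("APL of largest SCC:", nx.average_shortest_path_length(giant))

# degree centrality as one node-centrality measure
print("centrality:", nx.degree_centrality(G))
```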

Innovation is considered one of the reinforcing agents in organizations. Software engineering teams are composed of different people with different skills, and these teams can make use of innovation. The aims of using innovation in such teams are of three types: competition in the market, production of goods, and general or other goals. This research examines different viewpoints and presents innovation as an opportunity for software teams; however, if not controlled, innovation can be a real challenge for them. The research tries to answer the question of whether innovation is an opportunity or a challenge in software development. As the research indicates, innovation was accepted as an opportunity in software development teams and, if controlled and managed correctly, it can bring many advantages to the team and the organization.

http://scialert.net/abstract/?doi=jse.2013.151.155 2013/09/06 - 23:06

Source code contains many structural features that embody latent information which, if identified, can help software engineers develop quality software in less time. For instance, many programming rules are hidden in sets of function calls, variable usages, data accesses in functions, object interactions and so on, and seldom exist outside the minds of developers. Violations of these rules may introduce bugs which are difficult to uncover, report to bug-tracking systems and fix unless the rules are explicitly documented and made available to the development team. To address this problem, strong analysis techniques must be applied to source code to find latent programming patterns that can be useful for performing various software engineering tasks. This study demonstrates, by proposing a framework, how data mining techniques can be applied to source code to improve software quality and productivity. The approach finds different programming patterns such as programming rules, variable correlations, code clones and frequent API usage patterns. Furthermore, efficient algorithms are proposed to automatically detect violations of the extracted rules. The proposed framework is validated with a prototype and evaluated on various projects of significant size and complexity. Results show that the proposed technique greatly reduces the time and cost of programmers manually checking source code for defects.

http://scialert.net/abstract/?doi=jse.2013.86.105 2013/06/21 - 13:57
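
One of the pattern types, programming rules mined from co-occurring function calls, fits in a short sketch: call pairs that co-occur with high support and confidence imply a rule, and functions containing one call but not the other are flagged as potential violations. The call sets and thresholds are illustrative, not the paper's algorithms.

```python
from collections import Counter
from itertools import combinations

# call sets extracted from each function body (hypothetical project)
functions = {
    "f1": {"lock", "read", "unlock"}, "f2": {"lock", "write", "unlock"},
    "f3": {"lock", "read", "unlock"}, "f4": {"lock", "write"},  # suspicious
    "f5": {"open", "parse", "close"},
}

pair_count, call_count = Counter(), Counter()
for calls in functions.values():
    call_count.update(calls)
    pair_count.update(frozenset(p) for p in combinations(sorted(calls), 2))

# a rule x => y holds if the pair is frequent and confidence is high
for pair, sup in pair_count.items():
    a, b = tuple(pair)
    for x, y in ((a, b), (b, a)):
        if sup >= 3 and sup / call_count[x] >= 0.7:
            for name, calls in functions.items():
                if x in calls and y not in calls:   # rule violation
                    print(f"{name}: calls {x} but not {y} (violates {x}=>{y})")
```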

It is well known that coiled tubing (CT) works under severe plastic deformation and is widely used in the oil field owing to its special advantages. Under the interaction of bending, axial loads and internal pressure, especially above 20 MPa, the diameter grows quickly and the pipe deforms seriously, which can lead to incompatibility failure or accidents resulting from declining mechanical performance. Studying the cross-sectional deformation of the pipe during severe plastic deformation is therefore very important for safe operation. In this study, full-scale fatigue experiments were performed on a CT fatigue machine under different internal pressures and diameter measurements were carried out. The variation and distribution of diameter growth along the bending board were discussed, providing a reliable basis for judging dangerous locations and stress concentration areas. Laws governing the growth ratio of the maximum diameter during fatigue deformation were found, which allow the degree of deformation to be evaluated, and real-time monitoring of the diameter can predict the remaining life.

http://scialert.net/abstract/?doi=jse.2013.106.113 2013/06/21 - 13:57

Software defect prediction is an important approach that helps practitioners manage software projects effectively. A large number of existing models are based on program metrics, but exactly how much these metrics account for defect rates remains controversial. Programming is a knowledge-intensive activity: the content of the task and the expertise of individuals influence defect rates considerably. Thus, if these interfering factors are not well controlled, it is hard to draw any conclusions about how program metrics alone affect defect rates. Extremely limited evidence from controlled experiments exists on this problem. This study bridges the gap by conducting a controlled experiment in which a programming problem is solved by subjects with the same background in academic performance, programming training and programming experience. Fifty-four subjects participated in the experiment, producing 51 different versions of programs in the same language, C. The results demonstrate that program metrics can account for only 27.6% of the variability in defect rate.

http://scialert.net/abstract/?doi=jse.2013.114.120 2013/06/21 - 13:57

SLAM (Simultaneous Localization and Mapping) and path planning are two important research directions in the field of robotics, and exploring an entirely unknown dynamic environment efficiently is a difficult problem for intelligent mobile robots. In this study, a method of information fusion, DSmT (Dezert-Smarandache Theory), an extension of DST (Dempster-Shafer Theory), is introduced to deal with highly conflicting and uncertain information, and a multi-agent robot system with a GREM (Generalized Evidence Reasoning Machine) based on DSmT is presented for mobile robot SLAM and for efficiently planning smooth paths in unknown dynamic environments. The single robot is treated as a multi-agent system and the corresponding architecture, combined with cooperative control, is constructed. Considering the characteristics of sonar sensors, the grid map method is adopted and a sonar sensor mathematical model is constructed based on DSmT; several general basic belief assignment functions (gbbaf) are constructed for fusion. To make the A* algorithm, the classical method for global path planning, suitable for local path planning, a safety guard district search method and an approach for optimizing the searched paths are proposed. Finally, SLAM and path planning experiments are carried out with a Pioneer 2-DXe mobile robot. The experimental results verify the validity of the hybrid DSm (Dezert-Smarandache) model under the DSmT framework for fusing imprecise information during map building and also reveal the validity and superiority of the multi-agent system for path planning in unknown dynamic environments.

http://scialert.net/abstract/?doi=jse.2013.46.67 2013/04/30 - 17:57
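
The evidence-fusion core can be sketched with the classical Dempster combination rule over the frame {empty, occupied} used in sonar grid mapping; DSmT generalizes this rule to highly conflicting and non-exclusive hypotheses, which the sketch does not reproduce. The mass values are toy numbers.

```python
from itertools import product

def dempster(m1, m2):
    """m1, m2: dicts mapping frozenset hypotheses to mass values."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                  # mass assigned to the empty set
    k = 1.0 - conflict                         # normalization factor
    return {h: v / k for h, v in combined.items()}

E, O = frozenset({"empty"}), frozenset({"occupied"})
EO = E | O                                     # ignorance: "empty or occupied"
m_sonar1 = {O: 0.6, EO: 0.4}                   # reading 1: probably occupied
m_sonar2 = {O: 0.5, E: 0.2, EO: 0.3}           # reading 2: weaker, conflicting
print(dempster(m_sonar1, m_sonar2))
```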