Pesquisa Operacional

SciELO Brazil

Random-key genetic algorithms are heuristic methods for solving combinatorial optimization problems. They represent solutions as vectors of randomly generated real numbers, the so-called random keys. A deterministic algorithm, called a decoder, takes as input a vector of random keys and associates with it a feasible solution of the combinatorial optimization problem, for which an objective value or fitness can be computed. We compare three types of random-key genetic algorithms: the unbiased algorithm of Bean (1994); the biased algorithm of Gonçalves and Resende (2010); and a greedy version of Bean's algorithm, on 12 instances from four types of covering problems: general-cost set covering, Steiner triple covering, general-cost set k-covering, and unit-cost covering by pairs. Experiments are run to construct runtime distributions for 36 heuristic/instance pairs. For all pairs of heuristics, we compute probabilities that one heuristic is faster than the other on all 12 instances. The experiments show that, in 11 of the 12 instances, the greedy version of Bean's algorithm is faster than Bean's original method and that the biased variant is faster than both variants of Bean's algorithm. 2014/08/23 - 18:13
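
To make the encoding concrete, here is a minimal sketch (not the authors' implementation) of how a decoder can turn a vector of random keys into a feasible set cover: columns are scanned in increasing key order and added greedily until every row is covered. The instance below is an invented toy example.

```python
import random

def decode(keys, columns, n_rows):
    """Decode a random-key vector into a feasible set cover.

    keys[i] orders column i; columns[i] is the set of rows it covers.
    Columns are taken in increasing key order and kept only if they
    cover at least one still-uncovered row.
    """
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    uncovered = set(range(n_rows))
    chosen = []
    for i in order:
        if not uncovered:
            break
        if columns[i] & uncovered:
            chosen.append(i)
            uncovered -= columns[i]
    return chosen

# Toy instance: 4 rows, 4 candidate columns.
columns = [{0, 1}, {1, 2}, {2, 3}, {0, 3}]
keys = [random.random() for _ in columns]
cover = decode(keys, columns, n_rows=4)
assert set().union(*(columns[i] for i in cover)) == {0, 1, 2, 3}
```

The genetic operators then act only on the key vectors, so every chromosome decodes to a feasible solution by construction.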

This paper investigates the one-dimensional cutting stock problem considering two conflicting objective functions: minimization of both the number of objects and the number of different cutting patterns used. A new heuristic method based on the concepts of genetic algorithms is proposed to solve the problem. This heuristic is empirically analyzed by solving randomly generated instances and also practical instances from a chemical-fiber company. The computational results show that the method is efficient and obtains positive results when compared to other methods from the literature. 2014/08/23 - 18:13

Keen competition and increasingly demanding customers have forced companies to use their resources more efficiently and to integrate production and transportation planning. In the last few years more and more researchers have also focused on this challenging problem by trying to determine the complexity of the individual problems and then developing fast and robust algorithms to solve them. This paper reviews existing literature on integrated production and distribution decisions at the tactical and operational level, where the distribution part is modelled as some variation of the well-known Vehicle Routing Problem (VRP). The focus is thereby on problems that explicitly consider deliveries to multiple clients in a less-than-truckload fashion. In terms of the production decisions we distinguish in our review between tactical and operational production problems by considering lot-sizing/capacity allocation and scheduling models, respectively. 2014/08/23 - 18:13

This paper analyzes defense systems taking into account the strategic interactions between two rational agents; one of them is interested in designing a defense system against purposeful attacks of the other. The interaction is characterized by a sequential game with perfect and complete information. Reliability plays a fundamental role in both defining agents' actions and in measuring performance of the defense system for which a series-parallel configuration is set up by the defender. The attacker, in turn, focuses on only one defense subsystem in order to maximize her efficiency in attacking. An algorithm involving backward induction is developed to determine the equilibrium paths of the game. Application examples are also provided. 2014/08/23 - 18:13
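
As an illustration of the solution concept (not the authors' algorithm), a two-stage game with perfect and complete information can be solved by backward induction: first compute the attacker's best response to each defense, then let the defender optimize against those responses. All payoffs below are hypothetical.

```python
def backward_induction(defenses, attacks, payoff):
    """Equilibrium path of a two-stage perfect-information game.

    payoff[(d, a)] -> (defender_payoff, attacker_payoff).
    The attacker best-responds to each defense; the defender then
    chooses the defense whose induced outcome she prefers.
    """
    best_reply = {d: max(attacks, key=lambda a: payoff[(d, a)][1])
                  for d in defenses}
    d_star = max(defenses, key=lambda d: payoff[(d, best_reply[d])][0])
    return d_star, best_reply[d_star]

# Hypothetical payoffs: reinforcing a subsystem deters attacks on it.
payoff = {
    ('reinforce A', 'attack A'): (3, 1),
    ('reinforce A', 'attack B'): (1, 3),
    ('reinforce B', 'attack A'): (2, 2),
    ('reinforce B', 'attack B'): (3, 1),
}
path = backward_induction(['reinforce A', 'reinforce B'],
                          ['attack A', 'attack B'], payoff)
```

The defender anticipates the attacker's best reply at each branch before committing to a configuration, which is exactly what the sequential structure of the game captures.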

Traditional GARCH models fail to explain at least two of the stylized facts found in financial series: the asymmetry of the distribution of errors and the leverage effect. The leverage effect stems from the fact that losses have a greater influence on future volatilities than do gains. Asymmetry means that the distribution of losses has a heavier tail than the distribution of gains. We test whether these features are present in some series related to the Brazilian market. To test for the presence of these features, the series were fitted by GARCH(1,1), TGARCH(1,1), EGARCH(1,1), and GJR-GARCH(1,1) models with standardized Student t distribution errors with and without asymmetry. Information criteria and statistical tests of the significance of the symmetry and leverage parameters are used to compare the models. The estimates of the VaR (value-at-risk) are also used in the comparison. The conclusion is that both stylized facts are present in some series, mostly simultaneously. 2014/08/23 - 18:13
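
The leverage effect can be seen directly in the GJR-GARCH(1,1) variance recursion, sketched below with illustrative (not estimated) parameters: the extra gamma term fires only after negative returns.

```python
def gjr_garch_variance(returns, omega, alpha, beta, gamma):
    """Conditional variances of a GJR-GARCH(1,1) model:

    sigma2_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}**2
               + beta * sigma2_{t-1}

    The gamma term captures leverage: a loss raises next-period
    variance more than a gain of the same size.
    """
    # Start at the unconditional variance implied by the parameters.
    sigma2 = [omega / (1 - alpha - beta - gamma / 2)]
    for r in returns[:-1]:
        shock = (alpha + (gamma if r < 0 else 0.0)) * r * r
        sigma2.append(omega + shock + beta * sigma2[-1])
    return sigma2

# Illustrative parameters only; in practice they are estimated by ML.
after_loss = gjr_garch_variance([-1.0, 0.0], 0.1, 0.05, 0.85, 0.1)[1]
after_gain = gjr_garch_variance([1.0, 0.0], 0.1, 0.05, 0.85, 0.1)[1]
assert after_loss > after_gain  # leverage effect
```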

Based on an agro-technical study for the mid-west region of Brazil, and considering financial conditions such as monthly expenses and long-term investments, a mixed-integer dynamic linear model has been proposed for representing crop production systems. This model establishes a monthly dynamic treatment of production and financial activities over a long-term planning horizon for small and medium farm systems. In this paper, considering more recent government financial policies for the Brazilian agricultural sector related to the Pronaf and Proger credit lines, the mathematical model is updated for distinct situations derived from the use of the short- and long-term loans defined for small and medium farmers. In this way, new versions of the original model are obtained by separately incorporating into the production systems the economic and financial conditions of the credit lines for the years 2006 and 2009. Computational tests are performed and the results obtained are presented in several scenarios. Also, an evolutionary analysis of the socio-economic and financial feasibility of the agricultural farm system over the last decade is drawn by comparing the results obtained with those known from the year 2002. 2014/08/23 - 18:13

This paper presents a multicriteria decision model for selecting a portfolio of information system (IS) projects, which integrates strategic and organizational views within a multicriteria decision structure. The PROMETHEE V method, based on outranking relations, is applied, considering the c-optimal concept in order to overcome some scaling problems found in the classical PROMETHEE V approach. Then, a procedure is proposed to make a final analysis of the c-optimal portfolios found as a result of using PROMETHEE V. Also, the organizational view is discussed, including some factors that may influence decision making on IS projects to be included in the portfolio, such as the company's strategic vision and technical aspects that demonstrate how IS contributes value to a company's business. 2014/08/23 - 18:13

The area of Guaratiba, in Rio de Janeiro, presents extraordinary population growth rates that exceed those of all other districts of the city. Moreover, the public investments underway, in view of the 2016 Olympic Games, are making the region even more attractive. Therefore, it is appropriate to suggest proactive measures to avoid the predicted collapse of several public systems, among them the education system. This paper considers the projected population for the years 2015 and 2020 and, using various computing resources, especially the ArcGIS Network Analyst tool for measuring traveled distances, proposes locating new facilities with the Capacitated p-Median Model and with the Maximum Covering Location Problem, considering an ideal maximal home-school distance of 1,500 meters, but also evaluating longer distances. Both problems have been solved with AIMMS. The consideration of both models provides a constructive insight that certainly improves the implemented solution and favors the local community. 2014/08/23 - 18:13
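
For intuition only, a greedy sketch of the Maximal Covering Location Problem is shown below (the exact models are solved with AIMMS in the study); `candidates[j]` would hold the demand points within the distance limit of site j, and all data here are invented.

```python
def greedy_max_cover(candidates, demand, p):
    """Open p facilities, greedily maximizing newly covered demand.

    candidates[j]: set of demand points within the distance limit of
    site j; demand[i]: weight (e.g. school-age population) of point i.
    """
    open_sites, covered = [], set()
    for _ in range(p):
        j = max(candidates,
                key=lambda s: sum(demand[i] for i in candidates[s] - covered))
        open_sites.append(j)
        covered |= candidates[j]
    return open_sites, covered

# Invented toy data: 5 demand points, 3 candidate school sites.
candidates = {'A': {0, 1, 2}, 'B': {2, 3}, 'C': {3, 4}}
demand = {i: 1 for i in range(5)}
sites, covered = greedy_max_cover(candidates, demand, p=2)
assert covered == {0, 1, 2, 3, 4}
```

Note how the second site is chosen for its marginal coverage ('C' adds two new points, 'B' only one), which is the essence of the covering objective.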

Market players' investment decisions sometimes surprise analysts, especially when projects that are less feasible in financial terms enter the market first, before more viable projects. One possible explanation is that firms have different expectations concerning the future of the market. In this article we use the Option-Games approach for asymmetric duopolies to analyze investors' decisions in the first auction for wind power in Brazil, held in 2009, in which some less viable firms pushed more viable firms out of the auction. Our analysis shows that even small differences in the investors' views can yield this unexpected result. When uncertainty is low and expectations are symmetric, the outcome is a lower energy tariff as well as a stronger wind industry in Brazil, highlighting the importance of a clear and credible long-term governmental policy, not only for the wind industry, but also for any other industry in its early stages. 2014/08/23 - 18:13

One of the most important aspects of a company's success is the relationship between the company and its suppliers. Consequently, the way a supplier is selected is crucial to the outcome of the business. Thus, we propose a multicriteria decision support model with two phases: the analysis of the products/services from suppliers that need to be evaluated, using PROMSORT, and the analysis of the suppliers of those products/services that are considered critical, using PROMETHEE II. The model was applied to a Distribution Center of an important Brazilian retailer which serves stores in the North and Northeast regions of Brazil. Using the proposed model, companies can focus their attention on those products or services that have the greatest impact on their business results. The model predicts that different decision-making processes should be applied, in accordance with the class of importance into which the products or services are classified. 2014/08/23 - 18:13

Different approaches for deploying resilient optical networks of low cost constitute a traditional group of NP-hard problems that have been widely studied. Most of them are based on the construction of low-cost networks that fulfill connectivity constraints. However, recent trends to virtualize optical networks over the legacy fiber infrastructure have modified the nature of network design problems and rendered many of these models and algorithms inappropriate. In this paper we study a design problem arising from the deployment of an IP/MPLS network over an existing DWDM infrastructure. Besides cost and resiliency, this problem integrates traffic and capacity constraints. We present an integer programming formulation for the problem and theoretical results, and describe how several metaheuristics were applied in order to find good-quality solutions for a real application case of a telecommunications company. 2014/05/14 - 15:41

We present the transcript of the IFORS distinguished lecture delivered by the author on invitation of SOBRAPO and IFORS. The lecture concerned the development of an interdisciplinary research motivated by an application in mobile telecommunication systems, a project jointly developed by four research teams. The presentation follows the typical steps of a classical operations research study, and aims at reviewing the main theoretical and practical results that were obtained. 2014/05/14 - 15:41

This work aims at complementing the development of the EFM (Ellipsoidal Frontier Model) proposed by Milioni et al. (2011a). EFM is a parametric input allocation model of constant sum that uses DEA (Data Envelopment Analysis) concepts and ensures a solution such that all DMUs (Decision Making Units) are strongly CCR (Constant Returns to Scale) efficient. The degrees of freedom obtained with the possibility of assigning different values to the ellipsoidal eccentricities bring flexibility to the model and raise the interest in evaluating the best distribution among the many that can be generated. We propose two analyses, named local and global. In the first, we aim at finding a solution that assigns the smallest possible input value to a specified DMU. In the second, we look for a solution that ensures the lowest data variability. 2014/05/14 - 15:41

The 0-1 exact k-item quadratic knapsack problem (E-kQKP) consists of maximizing a quadratic function subject to two linear constraints: the first one is the classical linear capacity constraint; the second one is an equality cardinality constraint on the number of items in the knapsack. Most instances of this NP-hard problem with more than forty variables cannot be solved within one hour by commercial software such as CPLEX 12.1. We therefore propose a fast and efficient heuristic method which produces both good lower and upper bounds on the value of the problem in reasonable time. Specifically, it integrates a primal heuristic and a semidefinite programming reduction phase within a surrogate dual heuristic. An extensive computational experiment over randomly generated instances with up to 200 variables validates the relevance of the bounds produced by our hybrid dual heuristic, which yields known optima (resp. proves optimality) in 90% (resp. 76%) of the instances within 100 seconds on average. 2014/05/14 - 15:41
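
A primal heuristic for such a problem can be sketched as a simple greedy (an illustration of the problem structure, not the authors' method): items are added one at a time by marginal quadratic profit per unit weight until exactly k items are packed. All numbers below are invented.

```python
def greedy_k_item_qkp(profit, weights, capacity, k):
    """Greedy lower bound for the exact k-item quadratic knapsack.

    profit is a symmetric matrix: profit[i][i] is the linear profit of
    item i, profit[i][j] the pairwise profit of packing i and j together.
    Returns (items, value), or None if the greedy gets stuck.
    """
    chosen, load = [], 0
    while len(chosen) < k:
        feasible = [i for i in range(len(weights))
                    if i not in chosen and load + weights[i] <= capacity]
        if not feasible:
            return None  # greedy found no k-item solution

        def density(i):
            # Marginal gain: own profit plus interactions with chosen items.
            gain = profit[i][i] + 2 * sum(profit[i][j] for j in chosen)
            return gain / weights[i]

        best = max(feasible, key=density)
        chosen.append(best)
        load += weights[best]
    value = sum(profit[i][j] for i in chosen for j in chosen)
    return sorted(chosen), value

# Toy symmetric instance: 3 items, capacity 5, exactly k = 2 items.
profit = [[3, 1, 0], [1, 2, 1], [0, 1, 4]]
items, value = greedy_k_item_qkp(profit, weights=[2, 2, 3], capacity=5, k=2)
```

A bound of this kind is cheap to compute and gives the exact methods a warm start; the quality gap is then certified by the upper bound.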

This paper presents a method for solving linear programming problems that include Interval Type-2 fuzzy constraints. The proposed method finds an optimal solution under these conditions using convex optimization techniques. Some feasibility conditions are presented, and some interpretation issues are discussed. An introductory example is solved using the proposed method, and its results are described and discussed. 2014/05/14 - 15:41

This paper describes an application in group decision making, aimed at developing a procedure to help define priorities in preventive maintenance activities. The method applied is called DRV Processes (Decision with Reduction of Variability) and it combines both statistical techniques and multicriteria decision aid procedures. Among its advantages, we may highlight the possibility of reducing the noise affecting information in group decision making and of reaching a consensual decision. This approach generally improves the level of shared knowledge and helps to avoid conflict within the group. The application was carried out in a major pharmaceutical production plant. The experience showed an eighty per cent reduction in the original amount of process noise. Moreover, the paper describes evidence of improvement in interpersonal relationships. 2014/05/14 - 15:41

Two graph classes are presented; the first one (k-ribbon) generalizes the path graph and the second one (k-fan) generalizes the fan graph. We prove that they are subclasses of chordal graphs and thus share the structural properties of this class. Solutions to two problems are presented: the determination of the subchromatic number and the determination of the toughness. It is shown that the elements of the new classes establish bounds for the toughness of k-path graphs. 2014/05/14 - 15:41

In this paper, we investigate the separation problem for some valid inequalities for the s-t elementary shortest path problem in digraphs containing negative directed cycles. As we will see, these inequalities depend on a given parameter k ∈ ℕ. To show the NP-hardness of the separation problem of these valid inequalities, considering the parameter k ∈ ℕ, we establish a polynomial reduction from the problem of the existence of k + 2 vertex-disjoint paths between k + 2 pairs of vertices (s1, t1), (s2, t2), ..., (sk+2, tk+2) in a digraph to the decision problem associated with the separation of these valid inequalities. Through some illustrative instances, we exhibit the evoked polynomial reduction in the cases k = 0 and k = 1. 2014/05/14 - 15:41

This work deals with a project scheduling problem where the tasks consume resources to be activated, but start to produce them after that. This problem is known as Dynamic Resource-Constrained Project Scheduling Problem (DRCPSP). Three methods were proposed to divide the problem into smaller parts and solve them separately. Each partial solution is obtained by CPLEX optimizer and is used to generate more complete partial solutions. The obtained results show that this hybrid method works very well. 2014/05/14 - 15:41

In this paper, we consider an OEM selling new products to the market offering (i) a warranty period during which defective units are dealt with at no cost for the customer, and (ii) a full refund to customers who return products that do not meet their expectations (consumer returns). The manufacturer has different options for satisfying the warranty cases as well as for utilizing the consumer returns. Warranty cases could be dealt with by repairing the defective units, replacing them with new products, or replacing them with refurbished consumer returns. Alternatively, leftover new products or consumer returns can also be sold on a secondary market. We develop a model and derive the OEM's optimal decisions with respect to these options under demand uncertainty on the primary market. 2013/12/21 - 10:19

In an electric power systems planning framework, decomposition techniques are usually applied to separate investment and operation subproblems in order to benefit from the use of independent solution algorithms. Real power systems planning problems can be rather complex, and their detailed representation often leads to greater effort to solve the operation subproblems. Traditionally, the algorithms used in the solution of transmission-constrained operation problems gain great computational advantage from a compact representation of the model, which means the elimination of some variables and constraints that do not affect the problem's optimal solution. This work presents a new methodology for solving generation and transmission expansion planning problems based on Benders decomposition, where the incorporation of the traditional operation models requires an additional procedure for evaluating the Lagrange multipliers associated with the constraints which are not explicitly represented yet are used in the construction of the Benders cuts during the iterative process. The objective of this work is to seek efficiency and consistency in the solution of expansion planning problems by allowing specialized algorithms to be applied in the operation model. It is shown that this methodology is particularly interesting when applied to stochastic hydrothermal problems, which usually require a large number of problems to be solved. The results of this methodology are illustrated by a Colombian system case study. 2013/12/21 - 10:19

The objective of this paper is to verify the robustness of the Least Square Monte Carlo and Grant, Vora & Weeks methods when used to determine the incremental payoff of the carbon market for renewable electricity generation projects, considering that the behavior of the price of Certified Emission Reductions, otherwise known as Carbon Credits, may be modeled using a jump-diffusion process. In addition, this paper analyses particular characteristics, such as absence of monotonicity, found in trigger curves obtained through use of the Grant, Vora & Weeks method to valuate these types of project. 2013/12/21 - 10:19

A graph is regular if every vertex is of the same degree; otherwise, it is an irregular graph. Although there is a vast literature devoted to regular graphs, only a few papers approach the irregular ones. We have found four distinct graph invariants used to measure the irregularity of a graph. All of them are determined through either the average or the variance of the vertex degrees. Among them is the index of the graph, a spectral parameter, which is given as a function of the maximum eigenvalue of its adjacency matrix. In this paper, we survey these invariants, highlighting their respective properties, especially those relative to extremal graphs. Finally, we determine the maximum values of those measures and characterize their extremal graphs in some special classes. 2013/12/21 - 10:19
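
Two of the degree-based measures mentioned above are easy to state; the sketch below computes the variance of the vertex degrees and the gap between the maximum and the average degree, both of which vanish exactly on regular graphs.

```python
def irregularity_measures(adjacency):
    """Degree variance and (max degree - average degree) of a simple
    graph given by its 0/1 adjacency matrix; both are 0 iff regular."""
    degrees = [sum(row) for row in adjacency]
    n = len(degrees)
    avg = sum(degrees) / n
    variance = sum((d - avg) ** 2 for d in degrees) / n
    return variance, max(degrees) - avg

# The triangle K3 is regular, the path P3 is not.
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
assert irregularity_measures(triangle) == (0.0, 0.0)
assert irregularity_measures(path3)[0] > 0
```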

In this article we consider some properties of interest for research production at Embrapa. We apply statistical tests to address questions related to the scale of operation, the presence of allocative inefficiencies, and the separability of inputs and outputs. The production process is assessed by nonparametric methods with the use of Data Envelopment Analysis. The period under analysis is 2002-2009. We conclude that Embrapa's technology frontier shows variable returns to scale, is allocatively efficient in general, and is separable in inputs and outputs. These characteristics justify the company policy of adopting a VRS solution and the aggregation of output variables. Scale inefficiencies are the basis for further input congestion studies. 2013/12/21 - 10:19

Since exploratory activity is at the heart of, and guides the future of, the oil industry, it is fundamental that there be a comprehensive analysis covering the various factors and nuances that arise in the selection of exploration projects. Moreover, it is essential that a decision model enable the decision-maker's preferences to be addressed in a structured (and methodologically correct) way, one which is easy to understand and to apply in a real-world setting. Therefore, this paper proposes a multicriteria decision model, underpinned by a deterministic procedure, for selecting a portfolio of oil and gas exploration projects; thereafter, a reality-based application is set out, based on a decision-making context within Petrobras. 2013/12/21 - 10:19

This work analyses the performance of three different population-based metaheuristic approaches applied to Fuzzy Cognitive Map (FCM) learning in qualitative control of processes. Fuzzy cognitive maps make it possible to include prior specialist knowledge in the control rule. In particular, Particle Swarm Optimization (PSO), a Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are considered for obtaining appropriate weight matrices for learning the FCM. A statistical convergence analysis over 10000 simulations of each algorithm is presented. In order to validate the proposed approach, two industrial control process problems previously described in the literature are considered in this work. 2013/12/21 - 10:19

In this article, we analyze how reasonable it is to play according to some Nash equilibria if players have a preference for one of their opponents' strategies. To formalize our argument, we propose the concepts of collaborative dominance and collaborative equilibrium, and study some of their properties. First, we prove that if a collaborative equilibrium exists, then it is always efficient, which can be seen as a focal property. Moreover, we argue that one reason that may lead players not to collaborate is that they may focus on security instead of efficiency, in which case they would prefer to play according to maximin strategies. This argument allows us to reduce the set of reasonable equilibria for games where a collaborative equilibrium exists. Finally, we show that two-player zero-sum games do not have collaborative equilibria, and that if they contain a strategy profile composed only of collaboratively dominated actions, then such a profile is a Nash equilibrium of the game. 2013/12/21 - 10:19

The hub-and-spoke network design problem, also known as the hub location problem, aims to find the concentration points in a given network flow so that the sum of the distances of the linkages is minimized. In this work, we compare discrete solutions of this problem, given by the branch-and-cut method applied to the p-hub median model, with continuous solutions, given by the hyperbolic smoothing technique applied to a min-sum-min model. Computational experiments for particular instances of the Brazilian air transportation system, with the number of hubs varying from 2 to 8, are conducted with the support of a discretization heuristic and the Voronoi diagram. 2013/12/21 - 10:19

Any interaction involving individuals whose objectives conflict with each other may give rise to a negotiation process. In a negotiation, each party should develop his/her own strategy, and a win-lose vision is frequently adopted. The main consequence of this behavior is a result in which both parties lose, especially when the negotiation involves more than one aspect, as in negotiations resulting from purchases of material for the construction industry, where aspects like price, quality and lead time should be considered. Most negotiations in the construction industry adopt a win-lose vision, and commonly only price is considered. The goal of this paper is to propose a framework to support negotiations between two parties (buyer and seller) in the supply chain of the construction industry. The combination of a win-win strategy with a multicriteria analysis produces a best compromise solution for both parties. A simulation of a negotiation using realistic data is presented. 2013/12/21 - 10:19

Even though the body of literature in the area of cutting and packing is growing rapidly, research seems to focus on standard problems in the first place, while practical aspects are less frequently dealt with. This is particularly true for setup processes which arise in industrial cutting processes whenever a new cutting pattern is started (i.e. a pattern is different from its predecessor) and the cutting equipment has to be prepared in order to meet the technological requirements of the new pattern. Setups involve the consumption of resources and the loss of production time capacity. Therefore, consequences of this kind must explicitly be taken into account for the planning and control of industrial cutting processes. This results in extensions to traditional models which will be reviewed here. We show how setups can be represented in such models, and we report on the algorithms which have been suggested for the determination of solutions of the respective models. We discuss the value of these approaches and finally point out potential directions of future research. 2013/08/13 - 21:08

To each instance of the Quadratic Assignment Problem (QAP) a relaxed instance can be associated. Both variances of their solution values can be calculated in polynomial time. The graph isomorphism problem (GIP) can be modeled as a QAP, associating its pair of data matrices with a pair of graphs of the same order and size. We look for invariant edge weight functions for the graphs composing the instances in order to try to find quantitative differences between variances that could be associated with the absence of isomorphism. This technique is sensitive enough to show the effect of a single edge exchange between two regular graphs of up to 3,000 vertices and 300,000 edges with degrees up to 200. Planar graph pairs from a dense family up to 300,000 vertices were also discriminated. We conjecture the existence of functions able to discriminate non-isomorphic pairs for every instance of the problem. 2013/08/13 - 21:08

This paper presents the statistical modeling for daily counting statistics of units that arrive for quality inspection at a food company. Different Poisson regression models were considered in order to analyze the data collected, with a Bayesian focus. The main objective was to forecast the daily average count based on co-variables such as days of the week. The analysis of co-variables is very often neglected by statistical packages that come with Discrete Event Simulation software. The discovery of the factors that influence these variations was essential to a more accurate modeling (the definition of simulation calendars) and enables industrial managers to make better decisions about the reallocation of people in the department, resulting in better planning of production capacity. 2013/08/13 - 21:08
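
As a rough sketch of the day-of-week effect (the study uses Bayesian Poisson regression, not this frequentist shortcut): with one dummy per weekday, the maximum-likelihood Poisson rate for each day is simply that day's mean count. The counts below are invented.

```python
def poisson_rates_by_weekday(counts, weekdays):
    """MLE rates of a Poisson model with a saturated day-of-week
    factor: the fitted rate per day equals that day's mean count."""
    totals, ndays = {}, {}
    for c, d in zip(counts, weekdays):
        totals[d] = totals.get(d, 0) + c
        ndays[d] = ndays.get(d, 0) + 1
    return {d: totals[d] / ndays[d] for d in totals}

# Invented counts: arrivals are systematically higher on Fridays.
rates = poisson_rates_by_weekday([10, 12, 20, 22],
                                 ['Mon', 'Mon', 'Fri', 'Fri'])
assert rates == {'Mon': 11.0, 'Fri': 21.0}
```

Per-day rates of this kind are exactly what a simulation calendar needs in place of a single global arrival rate.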

In this work, we study coerciveness notions and their implications to existence conditions. We start with the presentation of classical ideas of coerciveness in the framework of Optimization Theory, and, then, using a classical technical result, introduced by Ky Fan in 1961, we extend these ideas first to Optimization Problems and then to Equilibrium Problems. We point out the importance of related conditions to the introduced coerciveness notion in order to obtain existence results for Equilibrium Problems, without using monotonicity or generalized monotonicity assumptions. 2013/08/13 - 21:08

This paper describes an exact algorithm to solve a nonlinear mixed-integer programming model for capacity expansion and flow assignment in multicommodity networks. The model combines continuous multicommodity flow variables associated with nonlinear congestion costs and discrete decision variables associated with the arc expansion costs. After establishing precise correspondences between a mixed-integer model and a continuous but nonconvex model, an implicit enumeration approach is derived based on the convexification of the continuous objective function. Numerical experiments on medium-size instances considering one level of expansion are presented. The results reported on the performance of the proposed algorithm show that the approach is efficient, as commercial solvers were not able to tackle the instances considered. 2013/08/13 - 21:08

This study analyzes extreme values in the daily returns of 45 Brazilian stocks between 2 January 1995 and 18 March 2009. The incidence of observations outside the range of three standard deviations from the mean is at least five times greater than under the normal distribution. The occurrence of extreme values in the upper tail is 1.13 times higher than in the lower. The average of the extreme positive returns is higher than that of extreme negative returns. Half a percent of the days determined the outcome of the investment. Extreme values are at least ±7%. Investors should assess whether they will keep their holdings when returns of such magnitude occur. The characteristics of empirical distributions of stock returns favor the passive investor and the use of weight constraints in portfolio allocation models. 2013/08/13 - 21:08
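
The tail counts described above reduce to a simple computation; the sketch below (with fabricated numbers, not the 45-stock data set) counts observations beyond k standard deviations from the mean, split by tail.

```python
def extreme_return_counts(returns, k=3.0):
    """Count returns farther than k standard deviations from the mean,
    split into upper-tail and lower-tail occurrences."""
    n = len(returns)
    mean = sum(returns) / n
    std = (sum((r - mean) ** 2 for r in returns) / n) ** 0.5
    upper = sum(1 for r in returns if r > mean + k * std)
    lower = sum(1 for r in returns if r < mean - k * std)
    return upper, lower

# Fabricated series: mostly calm days plus one crash and one rally.
series = [0.0] * 98 + [0.10, -0.10]
assert extreme_return_counts(series) == (1, 1)
```

Under a normal distribution roughly 0.27% of observations fall outside three standard deviations, which is the benchmark the empirical counts are compared against.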

Setting out to solve operational problems is a frequent part of decision making on public safety. However, the pillars of tactics and strategy are normally disregarded. Thus, this paper focuses on a strategic issue, namely that of a city prioritizing areas in which there is a tendency for criminality to increase. A multiple criteria approach is taken, since such a situation is normally analyzed only from the perspective of the number of police occurrences. The proposed model is based on the SMARTS multicriteria method and was applied in a Brazilian city. It combines a multicriteria method and a Monte Carlo simulation to support an analysis of robustness. As a result, we highlight some differences between the model developed and the police-occurrence model. It might support differentiated policies for zones, by indicating where there should be strong actions, infrastructure investments, monitoring procedures and other public safety policies. 2013/08/13 - 21:08

Radiotherapy is nowadays one of the main alternatives for the treatment of several types of cancer. With technological development, especially in the case of 3D conformal radiotherapy, applications involving mathematical techniques and algorithms have been proposed to help develop a good treatment plan. This paper presents a multiobjective linear programming model of dose intensity. The focus of the model is to determine the best dose distribution of the radiation field, so that the prescribed dose is delivered to the tumor while affecting noble and healthy tissues as little as possible. A test case of prostate cancer was used as a numerical example of the model, and the Pareto-optimal frontier was generated using the weighted-function method. 2013/08/13 - 21:08
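
The weighted-function idea can be sketched over a finite set of candidate plans (toy objective values below, not clinical data): each weight yields one point of the Pareto-optimal frontier of the two minimized objectives, e.g. underdosing of the tumor versus dose in healthy tissue.

```python
def weighted_sum_frontier(points, weights):
    """Approximate the Pareto frontier of two objectives (both
    minimized) by scalarization: for each weight w keep the plan
    minimizing w*f1 + (1 - w)*f2."""
    frontier = []
    for w in weights:
        best = min(points, key=lambda p: w * p[0] + (1 - w) * p[1])
        if best not in frontier:
            frontier.append(best)
    return frontier

# Toy plans as (tumor underdose, healthy-tissue dose) pairs.
plans = [(1, 5), (2, 2), (5, 1), (4, 4)]
frontier = weighted_sum_frontier(plans, [0.1, 0.5, 0.9])
assert (4, 4) not in frontier  # the dominated plan is never selected
```

Sweeping the weight from 0 to 1 trades one objective against the other and traces the frontier the planner can then choose from.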

Day after day, the importance for a company of having an efficient storage location assignment system increases. Moreover, since products have different warehouse costs and customers' requirements also differ, it is important to sort products in order to adopt strategies for inventory management that are appropriate for each product. However, adopting a policy for each individual product is not applicable in the real world. Therefore, companies usually categorize products into classes and thereafter adopt specific inventory management policies. Given this situation, this paper puts forward the arguments for adopting a multi-criteria method, ELECTRE TRI, to sort products, considering both criteria relating to the characteristics of a product and to its physical location in the warehouse, and criteria that are important for inventory strategies, such as the profitability of each unit held in storage. 2013/08/13 - 21:08

In order to model the preferences of a decision-maker (DM) by means of fuzzy preference relations, a DM can utilize different preference formats (such as ordering of the alternatives, utility values, multiplicative preference relations, fuzzy estimates, and reciprocal as well as nonreciprocal fuzzy preference relations) to express his/her judgments. Afterward, the obtained information is utilized to construct fuzzy preference relations. Here we introduce a procedure that allows the use of so-called preference functions (which is a preference format utilized in the methods of the PROMETHEE family) to construct nonreciprocal fuzzy preference relations. With diverse preference formats being offered, a DM can select the one that is the most convenient to articulate his/her preferences. In order to demonstrate the applicability of the proposed procedure, a multicriteria decision-making problem related to the site selection for constructing a new hospital is considered here. 2013/08/13 - 21:08