Similar Documents
Found 20 similar documents (search time: 31 ms)
1.
Two new covering problems are introduced. The partial covering P-center problem minimizes a coverage distance in such a way that a given fraction of the population is covered. The partial set covering problem seeks the minimum number of facilities needed to cover an exogenously specified fraction of the population within a given coverage distance. The problems are formulated as integer linear programming problems. Bisection search algorithms are outlined for the two problems. The search algorithm repeatedly solves a Lagrangian relaxation of the maximal covering problem. Computational results for the Lagrangian relaxation of the maximal covering problem and for the bisection search algorithms are presented on problems with up to 150 nodes.
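The bisection idea above can be illustrated with a small sketch. Note this is a hypothetical stand-in: a greedy heuristic replaces the paper's Lagrangian-relaxation subproblem, and the function names, distance matrix, and demand data are illustrative assumptions, not the authors' algorithm.

```python
def min_facilities_partial_cover(dist, pop, alpha, D):
    """Greedy sketch of the partial set covering problem: add facilities until
    a fraction alpha of the population is covered within distance D.
    dist[i][j] is the distance from demand node i to candidate site j."""
    n, m = len(pop), len(dist[0])
    target = alpha * sum(pop)
    covered = [False] * n
    chosen = []
    covered_pop = 0.0
    while covered_pop < target:
        # pick the facility covering the most yet-uncovered population
        best_j, best_gain = None, 0.0
        for j in range(m):
            gain = sum(pop[i] for i in range(n)
                       if not covered[i] and dist[i][j] <= D)
            if gain > best_gain:
                best_j, best_gain = j, gain
        if best_j is None:
            return None  # no facility adds coverage: infeasible at this D
        chosen.append(best_j)
        for i in range(n):
            if dist[i][best_j] <= D:
                covered[i] = True
        covered_pop = sum(pop[i] for i in range(n) if covered[i])
    return chosen

def partial_p_center_distance(dist, pop, alpha, p):
    """Bisection over the sorted distinct distances: the smallest D at which
    p facilities suffice to cover fraction alpha of the population."""
    cands = sorted({d for row in dist for d in row})
    lo, hi = 0, len(cands) - 1
    best = None
    while lo <= hi:
        mid = (lo + hi) // 2
        sol = min_facilities_partial_cover(dist, pop, alpha, cands[mid])
        if sol is not None and len(sol) <= p:
            best = cands[mid]
            hi = mid - 1
        else:
            lo = mid + 1
    return best
```

For instance, with three demand nodes, two candidate sites, and an 80% coverage requirement, the bisection finds the smallest workable coverage distance by repeatedly solving the covering subproblem.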

3.
ABSTRACT. In this paper we propose new algorithms for the solution of both general and standard spatial price equilibrium problems, and test their performance with existing algorithms on randomly generated problems. For the standard problem, we propose decomposition schemes based on the concept of “equilibration operator” and compare their performance with the Frank-Wolfe method. For the general problem, we present alternative variational inequality formulations defined over Cartesian products of sets and then exploit these formulations to construct Gauss-Seidel-type serial decomposition methods. We then compare their performance with the projection method. Our computational tests suggest that the new schemes are substantially more efficient than earlier ones.
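The projection method mentioned above can be illustrated on the simplest possible instance: one commodity shipped over one link, with assumed linear supply and demand price functions. This is a toy sketch of the projection iteration, not the paper's decomposition schemes; the step size and price functions are illustrative assumptions.

```python
def spe_projection(supply_price, demand_price, t_cost, x0=0.0, gamma=0.1, iters=500):
    """Projection iteration for a single-link spatial price equilibrium:
    flow rises while the demand price exceeds supply price plus transport
    cost, and is projected back onto the nonnegative orthant."""
    x = x0
    for _ in range(iters):
        excess = demand_price(x) - supply_price(x) - t_cost
        x = max(0.0, x + gamma * excess)  # projection onto x >= 0
    return x
```

With supply price 10 + x, demand price 50 - x, and unit transport cost 4, the equilibrium flow solves 10 + x + 4 = 50 - x, i.e. x = 18, and the iteration converges there geometrically.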

4.
Abstract. We investigate the (r | Xp)-medianoid problem for networks. This is a competitive location problem that consists of determining the locations of r facilities belonging to a firm in order to maximize its market share in a space where a competitor is already operating with p facilities. We consider six scenarios resulting from the combination of three customer choice rules (binary, partially binary, and proportional) with two types of services (essential and unessential). Known discretization results about the existence of a solution in the set of nodes are extended. Some examples and computational experience using heuristic algorithms are presented.
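Under the binary choice rule described above, a customer's entire demand goes to the nearest facility. A brute-force sketch of evaluating and choosing the entrant's r locations follows; it is a hypothetical illustration (exponential enumeration over candidate sites, with ties broken in the competitor's favor), not one of the paper's heuristics.

```python
from itertools import combinations

def captured(entrant, comp_dist, dist, demand):
    """Binary choice rule: customer i switches to the entrant only if its
    closest new facility is strictly closer than the closest competitor
    facility (comp_dist[i])."""
    total = 0.0
    for i, w in enumerate(demand):
        if min(dist[i][j] for j in entrant) < comp_dist[i]:
            total += w
    return total

def best_r_medianoid(dist, comp_dist, demand, r):
    """Brute-force (r | Xp)-medianoid over all r-subsets of candidate sites."""
    m = len(dist[0])
    return max(combinations(range(m), r),
               key=lambda S: captured(S, comp_dist, dist, demand))
```

The discretization results the abstract refers to justify restricting candidate sites to the network's nodes, which is what makes this finite enumeration meaningful.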

5.
ABSTRACT. The spatial price equilibrium on a general network may be formulated as a nonlinear-cost mathematical programming problem with simple constraints, when the decision variables are the path flows. The solution of this problem is difficult due to the very large number of variables (paths) and the impracticality of generating all the paths from all the origins to all the destinations. In this paper, we develop a Gauss-Seidel-Newton projection algorithm and combine it with a restriction strategy, which makes it unnecessary to generate all the paths a priori. This algorithm may be further improved by exploiting the equivalence between the spatial price equilibrium on a general network and the network equilibrium. Computational results that we present in this paper demonstrate the efficiency of the proposed solution algorithms.

6.
To date, aerial archaeologists generally apply simple rectification procedures or more expensive and time-consuming orthorectification algorithms to correct their aerial photographs in varying degrees for geometrical deformations induced by the topographical relief, the tilt of the camera axis and the distortion of the optics. Irrespective of the method applied, the georeferencing of the images is commonly determined with ground control points, whose measurement and identification are time-consuming operations that often prevent certain images from being accurately georeferenced. Moreover, specialised software, certain photogrammetric skills, and experience are required. Thanks to the recent advances in the fields of computer vision and photogrammetry as well as the improvements in processing power, it is currently possible to generate orthophotos from large, almost randomly collected sets of aerial photographs in a straightforward and nearly automatic way. This paper presents a computer vision-based approach that is complemented by proven photogrammetric principles to generate orthophotos from a range of uncalibrated oblique and vertical aerial frame images. In the first phase, the method uses algorithms that automatically compute the viewpoint of each photograph as well as a sparse 3D geometric representation of the scene that is imaged. Afterwards, dense reconstruction algorithms are applied to yield a three-dimensional surface model. After georeferencing this model, it can be used to create any kind of orthophoto out of the initial aerial views. To prove the benefits of this approach in comparison to the most common ways of georeferencing aerial imagery, several archaeological case studies are presented. Not only will they showcase the easy workflow and accuracy of the results, but they will also prove that this approach moves beyond current restrictions due to its applicability to datasets that were previously thought to be unsuited for convenient georeferencing.

7.
The issue of reallocating population figures from one set of geographical units onto another has received a great deal of attention in the literature. Every other day, a new algorithm is proposed, claiming that it outperforms competitor procedures. Unfortunately, when the new (usually more complex) methods are applied to a new data set, the improvements attained are sometimes just marginal. The cost-effectiveness of the solutions is case-dependent. The majority of studies have focused on large areas with heterogeneous population density distributions. The general conclusion is that, as a rule, more sophisticated methods are worth the effort. It could be argued, however, that when we work with a variable that varies gradually in relatively homogeneous small units, simple areal weighting methods could be sufficient and ancillary variables would produce marginal improvements. For the case of reallocating census data, our study shows that, even under the above conditions, the most sophisticated approaches clearly yield better results. After testing fourteen methods in Barcelona (Spain), the best results are attained using as ancillary variable the total dwelling area in each residential building. Our study shows that 3-D methods generate the best outcomes, followed by multiclass 2-D procedures, binary 2-D approaches, and areal weighting and 1-D algorithms. The point-based interpolation procedures by far produce the worst estimates.
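The simple areal weighting baseline that the abstract compares against can be sketched in a few lines. This is a generic illustration of the method (the overlap-area inputs and names are assumptions), not the study's 3-D or ancillary-variable procedures.

```python
def areal_weighting(source_pop, overlap_area, source_area):
    """Reallocate source-zone populations to target zones in proportion to
    overlap area, assuming uniform density within each source zone.
    overlap_area[s][t] is the area of source zone s inside target zone t."""
    n_targets = len(overlap_area[0])
    est = [0.0] * n_targets
    for s, pop in enumerate(source_pop):
        for t in range(n_targets):
            est[t] += pop * overlap_area[s][t] / source_area[s]
    return est
```

A source zone of 100 inhabitants split 40/60 by area between two target zones is simply estimated as 40 and 60 inhabitants; the study's point is that ancillary data (such as dwelling area per building) can do much better than this uniform-density assumption.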

8.
This article presents the maximal covering problem on a network when some of the weights can be negative. Integer programming formulations are proposed and tested with ILOG CPLEX. Two heuristic algorithms, an ascent algorithm and simulated annealing, are proposed and tested. The simulated annealing approach provides the best results for a data set comprising 40 problems.
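A minimal simulated annealing sketch for maximal covering with possibly negative node weights follows. The move rule (single-facility swap), cooling schedule, and data layout are illustrative assumptions, not the article's tested implementation.

```python
import random, math

def covered_weight(chosen, cover_sets, w):
    """Total weight of nodes covered by at least one chosen facility;
    negative weights make covering some nodes undesirable."""
    covered = set()
    for j in chosen:
        covered |= cover_sets[j]
    return sum(w[i] for i in covered)

def anneal(cover_sets, w, p, T0=1.0, cooling=0.95, steps=2000, seed=0):
    """Simulated annealing over p-facility subsets (assumes p < number of
    candidate sites). Swap moves; worse moves accepted with Boltzmann
    probability at temperature T."""
    rng = random.Random(seed)
    m = len(cover_sets)
    cur = rng.sample(range(m), p)
    cur_val = covered_weight(cur, cover_sets, w)
    best, best_val = list(cur), cur_val
    T = T0
    for _ in range(steps):
        out = rng.randrange(p)
        cand = rng.choice([j for j in range(m) if j not in cur])
        nxt = list(cur)
        nxt[out] = cand
        val = covered_weight(nxt, cover_sets, w)
        if val >= cur_val or rng.random() < math.exp((val - cur_val) / T):
            cur, cur_val = nxt, val
            if cur_val > best_val:
                best, best_val = list(cur), cur_val
        T *= cooling
    return best_val, sorted(best)
```

With negative weights, the best facility is not necessarily the one covering the most nodes, which is exactly what makes this variant harder than the classical maximal covering problem.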

9.
Several procedures, based upon cell count analysis, have been proposed for classifying spatial distributions, or maps, associated with some region R. Such procedures are rather imprecise and are known to depend upon the sizes and shapes of the cells in the particular partition of R under consideration. In this paper, the problem is considered from the point of view of hypothesis testing. A test of randomness based upon an arbitrary number of partitions of R is given. If the hypothesis of randomness is rejected, additional tests may be performed to classify the map into one of two categories, clustered or regular. These tests provide a number of advantages over existing procedures. Based upon multiple partitions of R, they decrease the dependence upon any particular partition, and the corresponding classification is precise since the null hypothesis distribution of the test statistic is (asymptotically) known. Finally, they allow a great deal of flexibility in testing for certain alternatives to randomness, and are applicable to one-, two-, and three-dimensional maps.
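For context, one classical single-partition cell-count statistic is the index of dispersion; the sketch below shows it for a single partition only and is not the multi-partition test proposed in the paper.

```python
def dispersion_index(counts):
    """Index of dispersion for quadrat (cell) counts: (n-1) * variance / mean.
    Under complete spatial randomness the counts are approximately Poisson
    and the index follows a chi-square distribution with n-1 degrees of
    freedom; large values suggest clustering, small values regularity."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return (n - 1) * var / mean
```

A perfectly even count pattern gives an index of 0 (regularity), while piling all points into one cell inflates it far above the chi-square expectation (clustering) — exactly the partition-dependence the paper's multi-partition approach is designed to mitigate.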

10.
The boundary value problem in spatial statistical analysis
"The primary objective of this paper is to investigate procedures for detecting and handling existing border biasing in spatial statistical analysis." Six conventional solutions to the boundary value problem are criticized, and three alternate statistical solutions are proposed.  相似文献   

11.
In this paper, I reconsider the problem of pottery firing using a large set of comparable data, most of it collected during extensive fieldwork conducted in Africa and Asia. My main purpose is to assess the actual relationships between the firing procedures (structure, fuel, schedule and scale) and some of the firing conditions (time and temperature). Indeed, if different firing procedures result in different firing conditions, the fired pots might display distinct physical characteristics. I will first characterize the various procedures in terms that are both meaningful for anthropologists and likely to influence the thermal profile of a firing. I will then examine the characteristics of the various firing processes in terms of duration, maximum temperature, heating rate and soaking time.

12.
The objective of this paper is to obtain the optimum design of 3D reinforced concrete buildings in terms of their performance under earthquake loading. This goal is achieved by considering the minimisation of the eccentricity between the mass centre and the rigidity centre of each storey layout as the optimisation objective in order to produce torsionally balanced structures. This problem is considered as a combined topology and sizing optimisation problem. The location and the size of the columns and the shear walls of each storey layout constitute the design variables. Apart from the constraints imposed by the seismic and reinforced concrete structure design codes, architectural restrictions are also taken into account. The test examples showed that a reduction in the structural cost of the building is achieved by minimising the eccentricity between the mass centre and the rigidity centre of each storey layout. Evolutionary optimisation algorithms, in particular a specially tailored algorithm based on Evolution Strategies, are implemented for the solution of this type of structural optimisation problem.
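The core of an Evolution Strategy can be shown with the simplest (1+1) variant: mutate the current design, keep the better of parent and child. The objective below is a generic stand-in for the storey eccentricity measure; the real problem is a constrained mixed topology/sizing optimisation, so this is only a hypothetical sketch of the search principle.

```python
import random

def one_plus_one_es(f, x0, sigma=0.5, iters=500, seed=1):
    """(1+1) Evolution Strategy: Gaussian mutation of the design vector x,
    with greedy selection between parent and child."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        child = [xi + rng.gauss(0, sigma) for xi in x]
        fc = f(child)
        if fc <= fx:  # keep the child only if it is at least as good
            x, fx = child, fc
    return x, fx
```

Production Evolution Strategies add populations, recombination, and self-adaptive step sizes, but the mutate-and-select loop above is the common core.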

13.
Driven by progress in sensor technology, algorithms and data processing capabilities, the recording and 3D virtual modelling of complex archaeological sites is currently receiving much attention. Nevertheless, the huge effort and cost that must be invested to obtain realistic models remain a problem. Besides on-site measurements, much time is often spent manually rebuilding the whole site with a CAD package or a 3D-modelling tool.

14.
A family of local, asynchronous, iterative, and parallel procedures (LAIPPs) are described. These procedures are designed to find least cost paths (LCPs) in bounded regions of the plane that have been partitioned into triangular subregions. The unit cost of traversing a given subregion is uniform within the subregion, but varies between subregions. In the case of one-dimensional chains of triangles, the procedures are guaranteed to converge to a global LCP. Although the procedures are guaranteed to terminate, we have so far been unable to prove, in the general case of a two-dimensional triangulation, that the procedures always terminate in admissible paths. Extensive experimental investigations, however, have revealed convergence to admissible paths in all cases examined. While counterexamples are constructed to prove that the procedures may converge to a local rather than the global LCP, convergence to the global LCP was found to occur in nearly all of our experimental investigations. The computational complexity of the procedures, as measured by the number of iterations required for convergence, was found experimentally to be independent of the size of the domain in which the paths lay, and to be slightly greater than linear in the Euclidean length of the paths. The complexity was also found to depend on the cost structure of the domain. Such LAIPPs appear to be promising procedures for both application and further investigation. In particular, the question of convergence is an interesting problem.

15.
Multiple Facilities Location in the Plane Using the Gravity Model
Two problems are considered in this article. Both problems seek the location of p facilities. The first problem is the p-median, where the total distance traveled by customers is minimized. The second problem focuses on equalizing demand across facilities by minimizing the variance of total demand attracted to each facility. These models are unique in that the gravity rule is used for the allocation of demand among facilities rather than assuming that each customer selects the closest facility. In addition, we also consider a multiobjective approach, which combines the two objectives. We propose heuristic solution procedures for the problem in the plane. Extensive computational results are presented.
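The gravity allocation rule described above can be sketched as a Huff-style probabilistic split of each customer's demand. The exponential-decay form, the attractiveness weights, and the decay parameter lam are illustrative assumptions; the article's exact gravity specification may differ.

```python
import math

def gravity_shares(dist, attract, lam=1.0):
    """Huff-style gravity rule: customer i patronizes facility j with
    probability proportional to attract[j] * exp(-lam * dist[i][j]),
    rather than going entirely to the closest facility."""
    shares = []
    for row in dist:
        w = [a * math.exp(-lam * d) for a, d in zip(attract, row)]
        s = sum(w)
        shares.append([x / s for x in w])
    return shares

def facility_demands(dist, pop, attract, lam=1.0):
    """Total demand attracted to each facility under the gravity rule."""
    sh = gravity_shares(dist, attract, lam)
    return [sum(pop[i] * sh[i][j] for i in range(len(pop)))
            for j in range(len(attract))]

def demand_variance(dist, pop, attract, lam=1.0):
    """Objective of the second problem: variance of demand across facilities."""
    d = facility_demands(dist, pop, attract, lam)
    mean = sum(d) / len(d)
    return sum((x - mean) ** 2 for x in d) / len(d)
```

A customer equidistant from two equally attractive facilities splits its demand evenly, so the demand-variance objective is zero in that symmetric case; location heuristics would then search for facility coordinates minimizing either the travel objective or this variance.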

16.
Protests and opposition from displaced peasants demanding fair compensation for land acquisition occur on a daily basis in China and have become the most prominent social problem in rural parts of the country. Employing a procedural perspective on conflict, this paper aims to uncover the complexities and tensions that are triggered in the process by drawing on a case of land confiscation in Jining City, a medium-size city in Shandong Province, China. Our research shows that conflicts exist at various scales: both between local governments and rural households and between village officials and villagers. The paper argues that ambiguity in de jure and de facto land acquisition procedures has resulted in both an escalation of conflict and increasing inequality in the outcomes and benefits of the process. We conclude that the differences between de jure and de facto procedures in the process of land acquisition are a significant institutional barrier to the resolution of conflict in this important issue for rural China.

17.
This article addresses the problem of specification uncertainty in modeling spatial economic theories in stochastic form. It is ascertained that the traditional approach to spatial econometric modeling does not adequately deal with the type and extent of specification uncertainty commonly encountered in spatial economic analyses. Two alternative spatial econometric modeling procedures proposed in the literature are reviewed and shown to be suitable for analyzing systematically two sources of specification uncertainty, viz., the level of aggregation and the spatio-temporal dynamic structure in multiregional econometric models. The usefulness of one of these specification procedures is illustrated by the construction of a simple multiregional model for The Netherlands.

18.
Thermoluminescence has a great potential for dating archaeometallurgical slags, because the smelting process leads to a well-defined resetting of the ‘luminescence clock’. However, the complex compositions of slags have unpleasant consequences for TL measurements if the bulk slag substance is used. To overcome this problem, quartz has been separated out of slag matrices by chemical and physical procedures. The TL measurements were carried out on this defined mineral phase. This method was tested with seven slag samples from different locations and ages. In most cases, the TL ages determined show good agreement with reference data.

19.
Two alternative specifications for random fields are reviewed and models from both specifications identified. The paper then deals with statistical tests for discriminating between randomness and some of the models of dependence. These various tests are compared and criteria established for their use. Additional testing procedures are identified for the more difficult problem of distinguishing between models of dependence. The paper then looks at different process generators for certain random field models and concludes with a discussion of the importance of the theory for handling diffusion processes. The value of these results for both model specification and “form-process” studies is emphasized.

20.
This paper compares three methods commonly used to extract fossil phytoliths from sediments. A basic procedure using heavy liquid flotation and oxidation is compared with two other procedures across a range of sediment types commonly encountered in archaeological studies. The three procedures are: (1) a heavy liquid flotation method (HLF); (2) a burning method (POW); and (3) another heavy liquid flotation method (HLFPol) similar to HLF, but adapted to allow the extraction of pollen and spores as well as phytoliths, within a single process. Comparisons of the resulting output using these three techniques for phytolith extraction show that different methods can produce different results, and therefore basic techniques should be modified according to the characteristics of the sediments for which they are used. While all the techniques showed similarities in assemblage results, there were problems associated with disaggregation and effective separation of light and heavy fractions, in particular with the POW procedure. The evidence suggests that morphotype selection occurred both within the physical sorting process and in the process of inverting slides to shed excess residue; in both cases it is difficult to suggest a solution to the problem. The results show clearly that the advantages gained by using the POW procedure are largely outweighed by the problems encountered with its use, and because of possible size/shape selection, it is not recommended for general extraction procedures. The heavy liquid flotation procedures, on the other hand, are shown to produce more concentrated residues with higher levels of clarity and less potential than the POW procedure for sample bias. The use of a non-toxic heavy liquid, sodium polytungstate, now allows the process to be used in relative safety. It is recommended that analysts use heavy liquid flotation procedures with chemical treatments specific to sediment requirements.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号