Similar documents
20 similar documents retrieved (search time: 352 ms).
1.
Earthquake Early Warning Systems (EEWS), based on real-time prediction of ground motion or structural response measures, may play a role in reducing the vulnerability and/or exposure of buildings and lifelines. Seismologists have recently developed efficient methods for rapid estimation of event features from limited P-wave information. While an event is occurring, probabilistic distributions of magnitude and source-to-site distance are therefore available, and the ground motion at the site, conditioned on the seismic network measurements, may be predicted in analogy with Probabilistic Seismic Hazard Analysis (PSHA). The structural performance may then be obtained by Probabilistic Seismic Demand Analysis (PSDA) and used for real-time risk management. However, such predictions are made under highly uncertain conditions, which must be properly accounted for to limit false and missed alarms. In the present study, real-time risk analysis for early warning purposes is discussed: magnitude estimation is performed via a Bayesian approach, while earthquake localization is based on Voronoi cells. The procedure was tested by simulation on the EEWS under development in the Campanian region (southern Italy). The results lead to the conclusion that PSHA conditioned on the EEWS correctly predicts the hazard at the site, and that the false/missed alarm probabilities may be controlled by setting an appropriate decision rule and alarm threshold.
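The alarm-threshold idea in this abstract can be made concrete as a probability-of-exceedance test. The Python sketch below assumes a lognormal ground-motion prediction and placeholder numbers; it illustrates a generic decision rule, not the Campanian system's actual calibration.

```python
import math

def exceedance_probability(mu_log_im, sigma_log_im, im_critical):
    """P(IM > im_critical) for a lognormally distributed intensity measure
    with log-mean mu_log_im and log-standard-deviation sigma_log_im."""
    z = (math.log(im_critical) - mu_log_im) / sigma_log_im
    return 0.5 * math.erfc(z / math.sqrt(2))  # normal survival function

def should_alarm(mu_log_im, sigma_log_im, im_critical, p_threshold):
    """Alarm iff the predicted exceedance probability passes the threshold;
    raising p_threshold trades missed alarms against false alarms."""
    return exceedance_probability(mu_log_im, sigma_log_im, im_critical) > p_threshold

# Placeholder numbers: predicted median PGA 0.08 g with log-std 0.6,
# critical PGA 0.10 g, alarm if exceedance probability is above 30%.
print(should_alarm(math.log(0.08), 0.6, 0.10, p_threshold=0.3))  # True
```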

2.
Assessing the significance of multiple and dependent comparisons is an important, and often ignored, issue that becomes more critical as the size of data sets increases. If not accounted for, false-positive differences are very likely to be identified. The need to address this issue has led to the development of a myriad of procedures to account for multiple testing. The simplest and most widely used technique is the Bonferroni method, which controls the probability that a true null hypothesis is incorrectly rejected. However, it is a very conservative procedure. As a result, the larger the data set, the greater the chances that truly significant differences will be missed. In 1995, a new criterion, the false discovery rate (FDR), was proposed to control the proportion of false declarations of significance among those individual deviations from null hypotheses considered to be significant. It is more powerful than all previously proposed methods. Multiple and dependent comparisons are also fundamental in spatial analysis. As the number of locations increases, assessing the significance of local statistics of spatial association becomes a complex matter. In this article we use empirical and simulated data to evaluate the use of the FDR approach in appraising the occurrence of clusters detected by local indicators of spatial association. Results show a significant gain in identification of meaningful clusters when controlling the FDR, in comparison to more conservative approaches. When no control is adopted, false clusters are likely to be identified. If a conservative approach is used, clusters are only partially identified and true clusters are largely missed. In contrast, when the FDR approach is adopted, clusters are fully identified. Incorporating a correction for spatial dependence into conservative methods improves the results, but not enough to match those obtained by the FDR approach.
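For readers unfamiliar with the two corrections contrasted above, the following Python sketch applies the Bonferroni bound and the 1995 Benjamini-Hochberg FDR step-up procedure to the same vector of p-values; the synthetic data and function names are ours, not the authors'.

```python
import numpy as np

def bonferroni(p_values, alpha=0.05):
    """Reject H0 where p < alpha / m: controls the family-wise error rate,
    but becomes very conservative as m grows."""
    p = np.asarray(p_values)
    return p < alpha / len(p)

def benjamini_hochberg(p_values, alpha=0.05):
    """1995 FDR step-up procedure: sort the p-values, find the largest k
    with p_(k) <= k * alpha / m, and reject the k smallest hypotheses."""
    p = np.asarray(p_values)
    m = len(p)
    order = np.argsort(p)
    below = p[order] <= alpha * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()  # largest index meeting the bound
        reject[order[:k + 1]] = True    # reject everything smaller as well
    return reject

# 95 null p-values plus 5 strong signals: FDR control typically recovers
# more of the genuine signals than Bonferroni at the same nominal level.
rng = np.random.default_rng(0)
p = np.concatenate([rng.uniform(size=95), rng.uniform(0, 1e-4, size=5)])
print(bonferroni(p).sum(), benjamini_hochberg(p).sum())
```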

3.
Genoese commercial practices, among the best known in medieval Europe, relied on notaries and their cartularies to create and preserve valid contracts. This system of generating records raises questions about how private or secret Genoese business was, and whether it was safe from fraud and forgery. Legal and notarial sources reveal methods of organizing commerce that fostered trust by involving suitable witnesses and discreet notaries. The Genoese sacrificed some privacy in order to acquire a reputation for predictable and reliable contracts and markets.

4.
The capabilities for visualization, rapid data retrieval, and manipulation in geographic information systems (GIS) have created the need for new techniques of exploratory data analysis that focus on the “spatial” aspects of the data. The identification of local patterns of spatial association is an important concern in this respect. In this paper, I outline a new general class of local indicators of spatial association (LISA) and show how they allow for the decomposition of global indicators, such as Moran's I, into the contribution of each observation. The LISA statistics serve two purposes. On one hand, they may be interpreted as indicators of local pockets of nonstationarity, or hot spots, similar to the G_i and G_i* statistics of Getis and Ord (1992). On the other hand, they may be used to assess the influence of individual locations on the magnitude of the global statistic and to identify “outliers,” as in Anselin's Moran scatterplot (1993a). An initial evaluation of the properties of a LISA statistic is carried out for the local Moran, which is applied in a study of the spatial pattern of conflict for African countries and in a number of Monte Carlo simulations.
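A minimal Python sketch of the local Moran statistic introduced above, I_i = z_i * sum_j w_ij z_j, with a simplified (unconditional) permutation test; Anselin's conditional permutation scheme holds the value at location i fixed, so treat this as illustrative only.

```python
import numpy as np

def local_moran(x, W):
    """Local Moran's I at each location: I_i = z_i * sum_j w_ij * z_j,
    with z the standardized values and W a row-standardized spatial
    weights matrix with zero diagonal."""
    z = (x - x.mean()) / x.std()
    return z * (W @ z)

def permutation_pvalues(x, W, n_perm=999, seed=0):
    """Pseudo p-values from random relabeling of values over locations
    (a simplification of the conditional permutation test)."""
    rng = np.random.default_rng(seed)
    observed = local_moran(x, W)
    count = np.zeros_like(observed)
    for _ in range(n_perm):
        count += np.abs(local_moran(rng.permutation(x), W)) >= np.abs(observed)
    return (count + 1) / (n_perm + 1)

# Five locations on a line with rook neighbors, row-standardized.
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)
x = np.array([1.0, 2.0, 10.0, 11.0, 1.5])
print(local_moran(x, W))          # large positive I_i marks a local cluster
print(permutation_pvalues(x, W))
```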

5.
Road safety research is a data-rich field with large social impacts. As in medical research, the ambition is to build knowledge around risk factors that can save lives. Unlike medical research, road safety research generates empirical findings from messy observational datasets. Records of road crashes contain numerous intersecting categorical variables, with dominant patterns complicated by confounding; when conditioning on the data to make inferences net of this confounding, observed effects are subject to uncertainty due to diminishing sample sizes. We demonstrate how visual data analysis approaches can inject rigor into exploratory analysis of such datasets. A framework is presented whereby graphics are used to expose, model and evaluate spatial patterns in observational data, as well as to protect against false discovery. Evidence for the framework is presented through an applied data analysis of national crash patterns recorded in STATS19, the main source of road crash information in Great Britain. Our framework moves beyond typical depictions of exploratory data analysis and transfers to the complex data analysis decision spaces characteristic of modern geographical analysis.

6.
This article introduces a procedure for progressively increasing the density of an initial point set that can be used as a basis for interpolating surfaces of variable resolution from sparse samples of data sites. The procedure uses the Simple Recursive Point Voronoi Diagram, in which Voronoi concepts are used to tessellate space with respect to a given set of generator points. The construction is repeated, each time with a new generator set comprising members selected from the previous generator set plus features of the current tessellation. We show how this procedure can be implemented in Arc/Info and present an illustration of its application using three known surfaces and alternative generator point configurations. Initial results suggest that the procedure has considerable potential, and we discuss further methods for evaluating and extending it.
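One densification step of this kind can be imitated with SciPy: take the vertices of the current Voronoi tessellation as the extra members of the next generator set. This is our reading of the procedure in generic Python, not the authors' Arc/Info implementation, and it keeps the full previous generator set at each step.

```python
import numpy as np
from scipy.spatial import Voronoi

def densify(points, n_steps=1):
    """Grow a generator set by repeatedly adding the vertices of its own
    Voronoi diagram to the previous generators (a simplified reading of
    the recursive point Voronoi densification described above)."""
    pts = np.asarray(points, dtype=float)
    for _ in range(n_steps):
        vor = Voronoi(pts)
        # vor.vertices holds the finite Voronoi vertices; a production
        # version would clip them to the study area.
        pts = np.unique(np.vstack([pts, vor.vertices]), axis=0)
    return pts

seed_pts = np.random.default_rng(1).uniform(size=(10, 2))
print(len(seed_pts), len(densify(seed_pts, n_steps=2)))  # set grows each step
```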

7.
The three-dimensional extension of the SAND (Spatial and Nonspatial Data) spatial database system is described, as is its use for data found in scientific visualization applications. The focus is on surface data. Some of the principal operations supported by SAND involve locating spatial objects in order of their distance from other spatial objects in an incremental manner, so that the number of objects needed does not have to be known a priori. These techniques are shown to be useful in enabling users to visualize the results of certain proximity queries without having to execute algorithms to completion, as would otherwise be the case when performing a nearest-neighbor query for which a Voronoi diagram (i.e., Thiessen polygons) would be computed as a preprocessing step before the query could be answered. This is achieved by making use of operations such as the spatial join and the distance semijoin. The utility of such operations is demonstrated in the context of posing meteorological queries to a spatial database with a visualization component.
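The incremental distance ordering described above can be imitated in a few lines of Python: a heap yields objects one at a time in increasing distance, so a consumer can stop as soon as it has seen enough, without fixing the neighbor count or computing a Voronoi diagram in advance. This is a generic sketch over raw points, not SAND's actual index-based algorithm.

```python
import heapq
import math

def incremental_nearest(query, objects):
    """Yield (distance, object) pairs in increasing distance from `query`.
    A spatial database would expand a disk-based index lazily; heapifying
    plain points is enough to show the incremental interface."""
    heap = [(math.dist(query, p), p) for p in objects]
    heapq.heapify(heap)
    while heap:
        yield heapq.heappop(heap)

# Consume only as many neighbors as the visualization needs, then stop:
# no Voronoi diagram is precomputed and no neighbor count is fixed.
pts = [(0, 0), (3, 4), (1, 1), (6, 8)]
for d, p in incremental_nearest((0, 0), pts):
    if d > 5:
        break
    print(p, d)
```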

8.
Verification, extension, and application of the urban breaking-point theory
The breaking-point theory proposed by P. D. Converse is widely used to delimit the spatial influence areas of cities and to divide urban economic regions. Because the theory only provides a formula for a single breaking point between each pair of cities, a variety of spatial partitioning methods have appeared in practice, and analysis shows that many of them are infeasible or lack rigor. By comparing the properties of Voronoi diagrams with those of the breaking-point theory, this paper proposes an extended breaking-point theory and the concept of breaking arcs. It proves that, in a homogeneous planar region, if two city points have equal weights, the boundary between their attraction areas is the perpendicular bisector of the segment joining them; if their weights differ, the boundary is a circular arc; the loci of all breaking points in the plane form, respectively, an ordinary Voronoi diagram and a weighted Voronoi diagram, with each city point's weight equal to the square root of its centrality strength. Finally, Henan Province is taken as a case study for delimiting urban spatial influence areas and urban economic regions.
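A numerical check of the claims above, assuming Converse's breaking-point formula d_A = d_AB / (1 + sqrt(P_B / P_A)) and a multiplicatively weighted Voronoi assignment with weights equal to the square roots of the city sizes; the figures are invented for illustration.

```python
import math

def breaking_point(d_ab, p_a, p_b):
    """Converse's formula: distance from city A to the breaking point on
    the A-B line, given city sizes (weights) p_a and p_b."""
    return d_ab / (1.0 + math.sqrt(p_b / p_a))

def weighted_voronoi_owner(x, y, cities):
    """Assign a location to the city minimizing d / w, i.e. membership in
    a multiplicatively weighted Voronoi region; per the abstract, w is the
    square root of the city's centrality strength."""
    return min(cities, key=lambda c: math.hypot(x - c["x"], y - c["y"]) / c["w"])

# Invented figures: cities A and B, 10 units apart, with sizes 9 and 4.
cities = [{"name": "A", "x": 0.0, "y": 0.0, "w": math.sqrt(9.0)},
          {"name": "B", "x": 10.0, "y": 0.0, "w": math.sqrt(4.0)}]
print(breaking_point(10.0, 9.0, 4.0))                    # 6.0 units from A
print(weighted_voronoi_owner(5.9, 0.0, cities)["name"])  # 'A': inside A's region
```

The two constructions agree: the breaking point sits 6 units from A, exactly where d/w is equal for both cities, consistent with the abstract's claim that the breaking-point loci form a weighted Voronoi diagram.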

9.
How German were German anarchists in the United States and Brazil? Did the experience of exile and immigration preserve or even heighten a national identity among radicals who openly espoused revolutionary internationalism? Anarchists distinguished between nation and nationality on the one hand, and the state and nationalism on the other. This article examines expressions of nationality by a handful of German anarchist editors and writers from the 1880s to the end of World War II. They wanted to be stateless, but not nationless. This article argues that German exile anarchists in the United States and Brazil expressed a militant, countercultural, antistatist and anticlerical nationality. They were ‘rooted cosmopolitans’: They identified with the international revolutionary tradition and at the same time remained attached to Germany's heritage of radical politics, arts and humanities. There was a remarkable consistency in their commentary levelled against Bismarck, the Kaiser, the Weimar government and the Nazis either in Germany or in the host country. Anarchists advocated for a borderless global federation of free communities and, to that end, rejected nationalism and urged people to stop ‘seeing like a state’ by exposing the false promises and crimes of statism.

10.
L. Gentelli, Archaeometry, 2021, 63(1): 156-172
This paper details the application of a statistical method for the chronological discrimination of silver coins using counts-per-second trace-element (inter-elemental) ratios. The statistical method described is based on one previously applied to similar archaeological materials to determine their provenance. It makes use of the inter-element association patterns of multi-element analytical data determined using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). The majority of the 266 coins analysed for this study had already been successfully identified by their mint markings. The LA-ICP-MS data, together with what is known about the coins through visual identification, were used to discriminate the reigning sovereign and, in the case of Mexico, the year of minting of individual coins within the elemental fingerprint of different mints. Unidentified coins can then be placed in the confusion matrix, and their trace-element information used to identify their year of minting by comparison with identified coins from the same mint. Linear discriminant analysis (LDA) was used to explore the identification of the minting year of coins not previously identified by other means, based on a statistical comparison against a database of compositional analyses of silver coins of known minting year.
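The classification step can be sketched with scikit-learn: fit an LDA on coins of known minting year, inspect a cross-validated confusion matrix, then assign years to unidentified coins. The element ratios below are synthetic placeholders; the authors' actual feature set is not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in data: 90 coins, 6 inter-elemental ratios each, with
# minting years as class labels (the real features come from LA-ICP-MS).
rng = np.random.default_rng(0)
years = np.repeat([1650, 1660, 1670], 30)
X = rng.normal(loc=years[:, None] / 1000.0, scale=0.02, size=(90, 6))

lda = LinearDiscriminantAnalysis()
pred = cross_val_predict(lda, X, years, cv=5)
print(confusion_matrix(years, pred))  # how cleanly the years separate

# Fit on all identified coins, then place an unidentified coin.
lda.fit(X, years)
unknown = rng.normal(loc=1.66, scale=0.02, size=(1, 6))
print(lda.predict(unknown))           # most plausible minting year
```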

11.
An interpretive approach to political science provides accounts of actions and practices that are interpretations of interpretations. We develop this argument using the idea of ‘situated agency’. There are many common criticisms of such an approach. This paper focuses on nine: that an interpretive approach is mere common sense; that it focuses on beliefs or discourses, not actions or practices; that it ignores concepts of social structure; that it seeks to understand actions and practices, not to explain them; that it is concerned exclusively with qualitative techniques of data generation; that it must accept actors' own accounts of their beliefs; that it is insensitive to the ways in which power constitutes beliefs; that it is incapable of producing policy-relevant knowledge; and that it is incapable of producing objective knowledge. We show that the criticisms rest on both misconceptions about an interpretive approach and misplaced beliefs in the false idols of hard data and rigorous methods.

12.
This article concerns the point feature cartographic label placement (PFCLP) problem, an NP-hard combinatorial problem. When all points must be labeled and overlaps are inevitable, the map can be more readable if overlapping labels are placed dispersively, that is, kept distant from each other. This work presents a constructive genetic algorithm (CGA) for the discrete dispersion PFCLP that uses the notion of masking to preserve optimal subsequences in chromosomes. We also formulate the discrete dispersion PFCLP as a mixed integer linear programming model, considering the problem of minimizing the number of labels in conflict as well. The computational results validate our CGA approach on instances of up to 5,046 points.
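A small helper clarifies the objective being discussed: counting pairwise overlaps among placed labels, the quantity that the minimum-number-of-labels-in-conflict variant minimizes. The rectangles and the overlap test are our simplification; the authors' CGA and MILP formulations are not reproduced here.

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap; rectangles are (xmin, ymin, xmax, ymax)."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def conflicts(labels):
    """Number of overlapping label pairs: the count that a
    minimum-conflict objective drives down."""
    return sum(overlaps(labels[i], labels[j])
               for i in range(len(labels))
               for j in range(i + 1, len(labels)))

# Three candidate placements: the first two collide, the third is clear.
placed = [(0, 0, 2, 1), (1, 0, 3, 1), (5, 5, 7, 6)]
print(conflicts(placed))  # 1
```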

13.
While established ethical norms and core legal principles concerning the protection of privacy may be easily identified, applying these standards to rapidly evolving digital information technologies, markets for digital information, and convulsive changes in social understandings of privacy is increasingly challenging. This challenge has been further heightened by the increasing creation of, access to, and sophistication of geocoded data, that is, data that contain time and global location components. This article traces the growing need for, and the structural challenges to, creating educational curricula that address the ethical and privacy dimensions of geospatial data.

14.
The 9/11 attacks made the war on terror the central plank of American grand strategy. Yet despite its importance in shaping US policy choices, there has been considerable confusion over how the war on terror relates to foreign policy goals. This article attempts to locate the war on terror within American grand strategy and makes three claims. First, it argues that the Bush administration's approach to the war on terror rests on a false analogy between terrorism and fascism or communism. This has led to misinterpretations of the goals of the war on terror and to a persistent misuse of American power. Second, it suggests that the central purpose of the war on terror should be to de-legitimize terror as a tactic and to induce states to assume responsibility for controlling terrorists within their borders. American grand strategy should be focused on creating a normative anti-terror regime with costly commitments by linchpin states, defined as great powers and crucial but endangered allies such as Pakistan and Saudi Arabia, rather than on conducting regime change against rogue states on the margins of the international system. Success in the war on terror should be measured not by the perceived legitimacy of discrete US policy choices, but by the number of these crucial states who accept the de-legitimation of terrorism as a core foreign policy principle and act accordingly. Third, it argues that bilateral enforcement of an anti-terror regime imposes high costs on US power and puts other elements of American grand strategy, including the promotion of democracy and human rights, at risk. To reduce these costs and to preserve American power over the long term, the US should attempt to institutionalize cooperation in the war on terror and to scale back ambitious policy choices (such as achieving a democratic revolution in the Middle East) that increase the risks of state defection from the anti-terror regime.

15.
16.
17.
Resin found within Canaanite amphorae from the Late Bronze Age shipwreck discovered off the coast of southwest Turkey at Uluburun has previously been identified as Pistacia sp. Although evidence from Egypt suggests that this resin was in high demand and typically transported in such amphorae, it has also been proposed that the amphorae contained wine, with the resin used to seal the interior surfaces and to flavour and/or preserve the wine. To attempt to resolve this question, we have analysed five samples of pistacia resin found in amphorae from the shipwreck using a range of analytical techniques which have been used in the past for the analysis of wine residues: spot tests, FT-IR, and HPLC-MS-MS. As well as the archaeological samples, we have analysed modern samples of pistacia resin, leaves, and fruit to determine the effectiveness of each technique and to exclude the possibility of false positive results. In addition to the analyses for wine, we also detail GC-MS analysis of the terpenoids for further molecular characterisation of the resin. Bulk stable isotope analysis was used, in comparison with similar resins, to attempt to identify the geographical origin of the resin.

18.
While researching the histories of 492 German soldiers killed in southern France in August and September 1944, three cases of soldiers falsely reported as killed in action were discovered. Each misidentification had a different cause: in the first case the precise circumstances are unclear, but the error may have followed an accidental exchange of identification tags with a fellow soldier; the second was probably caused by a mistaken report from a witness; the third seems to have been a misidentification by medical personnel unfamiliar with the bodies they were handling. The Wehrmacht used poorly designed identification tags, and when tags were not available it made no use of methods such as fingerprinting or tooth charts. Unreliable methods such as visual identification or witness testimony were deemed sufficient to report a soldier dead. As a consequence, false reports of death seem to have been relatively commonplace.

19.
The availability of individual-level health data presents opportunities for monitoring the distribution and spread of emergent, acute, and chronic conditions, as well as challenges with respect to maintaining the anonymity of persons with health conditions. Particularly when such data are mapped as point locations, concerns arise regarding the ease with which individual identities may be determined by linking geographic coordinates to digital street networks, then determining residential addresses and, finally, names of occupants at specific addresses. The utility of such data sets must therefore be balanced against the requirements of protecting the confidentiality of individuals whose identities might be revealed through the availability of precise and accurate locational data. Recent literature has pointed toward geographic masking as a means for striking an appropriate balance between data utility and confidentiality. However, questions remain as to whether certain characteristics of the mask (mask metadata) should be disclosed to data users and whether two or more distinct masked versions of the data can be released without breaching confidentiality. In this article, we address these questions by quantifying the extent to which the disclosure of mask metadata and the release of multiple masked versions may affect confidentiality, with a view toward providing guidance to custodians of health data sets. The masks considered include perturbation, areal aggregation, and their combination. Confidentiality is measured by the areas of confidence regions for individuals' locations, which are derived under the probability models governing the masks, conditioned on the disclosed mask metadata.
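A minimal Python sketch of the perturbation mask discussed above: each point is displaced uniformly within a disc of known radius. The radius is exactly the kind of mask metadata whose disclosure the article quantifies, since it bounds the confidence region around each masked point; the parameter values here are placeholders.

```python
import numpy as np

def perturbation_mask(points, max_radius, seed=None):
    """Displace each point by a random offset drawn uniformly from a disc
    of radius max_radius. Disclosing max_radius (mask metadata) confines
    each true location to a disc of area pi * max_radius**2 around the
    masked point."""
    rng = np.random.default_rng(seed)
    n = len(points)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    r = max_radius * np.sqrt(rng.uniform(size=n))  # sqrt: uniform over the disc
    offsets = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    return np.asarray(points, dtype=float) + offsets

# Two independently masked releases of the same record: an attacker who
# knows the radius can intersect the two discs, shrinking the confidence
# region; this is the multiple-release risk the article quantifies.
pts = np.array([[537100.0, 181200.0]])
m1 = perturbation_mask(pts, 250.0, seed=1)
m2 = perturbation_mask(pts, 250.0, seed=2)
print(m1, m2)
```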

20.
This essay reflects on how technological changes in biomedicine can affect what archival sources are available for historical research. Historians and anthropologists have examined the ways in which old biomedical samples can be made to serve novel scientific purposes, such as when decades-old frozen tissue specimens are analyzed using new genomic techniques. Those uses are also affected by shifting ethical regimes, which affect who can do what with old samples, or whether anything can be done with them at all. Archival collections are subject to similar dynamics, as institutional change and shifts in ethical guidelines and privacy laws affect which sources can be accessed and which are closed. I witnessed just such a change during my research into human genetics using archives in the Wellcome Collection. A few years into my project, those archives had their privacy conditions reassessed, and I saw how some sources previously seen as neutral were now understood to contain personal sensitive information. This paper describes the conditions of this shift, including the effects of technological change, new ethical considerations, and changing laws around privacy. I reflect on how these affected my understanding of the history of human genetics, and how I and others might narrate it.
