Dataset schema: id (string, length 7), title (string, 3 to 578 chars), abstract (string, 0 to 16.7k chars), keyphrases (sequence), prmu (sequence).
53wv7Eq
Function granularity estimation for multimodal optimization
In this article, we study the subspace function granularity and present a method to estimate the sharing distance and the optimal population size. To achieve multimodal function optimization, niching techniques diversify the population of Evolutionary Algorithms (EA) and encourage heterogeneous convergence to multiple optima. The key to successful diversification is effective resource sharing. Without knowing the fitness landscape, resource sharing is usually determined by uninformative assumptions on the number of peaks. Using the Probably Approximately Correct (PAC) learning theory and the epsilon-cover concept, a PAC neighborhood for a set of samples is derived. Within this neighborhood, we sample the fitness landscape and compute the subspace Fitness Distance Correlation (FDC) coefficients. Using the estimated granularity feature of the fitness landscape, the sharing distance and the population size are determined. Experiments demonstrate that by using the estimated population size and sharing distance, an Evolutionary Algorithm successfully identifies multiple optima.
[ "granularity", "optimization", "sharing distance", "fitness landscape" ]
[ "P", "P", "P", "P" ]
3Dsbj6u
Global optimization of general nonconvex problems with intermediate polynomial substructures
This work considers the global optimization of general nonconvex nonlinear and mixed-integer nonlinear programming problems with underlying polynomial substructures. We incorporate linear cutting planes inspired by reformulation-linearization techniques to produce tight subproblem formulations that exploit these underlying structures. These cutting plane strategies simultaneously convexify linear and nonlinear terms from multiple constraints and are highly effective at tightening standard linear programming relaxations generated by sequential factorable programming techniques. Because the number of available cutting planes increases exponentially with the number of variables, we implement cut filtering and selection strategies to prevent an exponential increase in relaxation size. We introduce algorithms for polynomial substructure detection, cutting plane identification, cut filtering, and cut selection and embed the proposed implementation in BARON at every node in the branch-and-bound tree. A computational study including randomly generated problems of varying size and complexity demonstrates that the exploitation of underlying polynomial substructures significantly reduces computational time, branch-and-bound tree size, and required memory.
[ "reformulation-linearization techniques", "branch-and-bound global optimization", "factorable polyhedral relaxation", "polynomial programming" ]
[ "P", "R", "M", "R" ]
4BCM786
Collective Suttee: Is It Unjust to Develop Life Extension if It Will Not Be Possible to Provide It to Everyone?
If we can anticipate that life extension will be too expensive to provide to everyone, is that a reason not to research and develop it? Collective suttee is the policy of inhibiting or prohibiting life extension on such grounds.
[ "life extension", "access", "antiaging", "anti-aging", "extended life", "equality", "immortality", "justice", "prolongevism", "prolongevity" ]
[ "P", "U", "U", "U", "M", "U", "U", "U", "U", "U" ]
-G5WHL2
Using BSP and Python to simplify parallel programming
Scientific computing is usually associated with compiled languages for maximum efficiency. However, in a typical application program, only a small part of the code is time-critical and requires the efficiency of a compiled language. It is often advantageous to use interpreted high-level languages for the remaining tasks, adopting a mixed-language approach. This will be demonstrated for Python, an interpreted object-oriented high-level language that is well suited for scientific computing. Particular attention is paid to high-level parallel programming using Python and the BSP model. We explain the basics of BSP and how it differs from other parallel programming tools like MPI. Thereafter we present an application of Python and BSP for solving a partial differential equation from computational science, utilizing high-level design of libraries and mixed-language (Python-C or Python-Fortran) programming.
[ "python", "parallel programming", "bsp" ]
[ "P", "P", "P" ]
bs3riD5
PROMETHEE: A comprehensive literature review on methodologies and applications
In recent decades, several Multi-Criteria Decision Aid (MCDA) methods have been proposed to help in selecting the best compromise alternatives. In the meantime, the PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) family of outranking methods and their applications has attracted much attention from academics and practitioners. In this paper, a classification scheme and a comprehensive literature review are presented in order to uncover, classify, and interpret the current research on PROMETHEE methodologies and applications. Based on the scheme, 217 scholarly papers from 100 journals are categorized into application areas and non-application papers. The application areas include the papers on the topics of Environment Management, Hydrology and Water Management, Business and Financial Management, Chemistry, Logistics and Transportation, Manufacturing and Assembly, Energy Management, Social, and Other Topics. The last area covers the papers published in several fields: Medicine, Agriculture, Education, Design, Government and Sports. The scholarly papers are also classified by (1) year of publication, (2) journal of publication, (3) authors' nationality, (4) PROMETHEE as applied with other MCDA methods, and (5) PROMETHEE as applied with the GAIA (Geometrical Analysis for Interactive Aid) plane. It is hoped that the paper can meet the needs of researchers and practitioners for easy reference on PROMETHEE methodologies and applications, and hence promote future PROMETHEE research.
[ "promethee", "literature review", "mcda", "outranking", "application areas", "gaia" ]
[ "P", "P", "P", "P", "P", "P" ]
FcB3xBs
A text summarizer for Arabic
Automatic text summarization is an essential tool in this era of information overload. In this paper we present an automatic extractive Arabic text summarization system where the user can cap the size of the final summary. It is a direct system where no machine learning is involved. We use a two-pass algorithm where in pass one, we produce a primary summary using Rhetorical Structure Theory (RST); this is followed by the second pass, where we assign a score to each of the sentences in the primary summary. These scores help us in generating the final summary. For the final output, sentences are selected with the objective of maximizing the overall score of the summary, whose size should not exceed the user-selected limit. We used ROUGE to evaluate our system-generated summaries of various lengths against those done by a (human) news editorial professional. Experiments on sample texts show our system to outperform some of the existing Arabic summarization systems, including those that require machine learning.
[ "automatic text summarization", "rhetorical structure theory", "arabic nlp", "0/1-knapsack", "rouge rouge" ]
[ "P", "P", "M", "U", "R" ]
2Rwc7mb
Effective stiffness and microscopic deformation of an orthotropic plate containing arbitrary holes
Many engineering materials and structures, such as cellular structures, sandwich core structures and laminated plates with holes, can be modeled by an inclusion problem with an anisotropic matrix. The paper studies the effective properties and the microscopic deformation of anisotropic plates with periodic holes by using direct and mathematical homogenization. The effective stiffnesses are calculated by different homogenization methods, and the microscopic deformation of an RVE (representative volume element) is modeled by the finite element method for the plate with arbitrarily shaped holes. All of the effective stiffness coefficients, especially the stretching-shear coupling coefficients, are evaluated.
[ "microscopic deformation", "inclusions", "anisotropic matrix", "effective properties", "homogenization" ]
[ "P", "P", "P", "P", "P" ]
9JfmhL:
A new stochastic resonance algorithm to improve the detection limits for trace analysis
Based on the stochastic resonance theory, a new stochastic resonance algorithm (SRA) to improve analytical detection limits for trace analysis is presented. In the new algorithm, stochastic resonance takes place in a bistable system driven only by the inherent noise of an analytical signal. The effect of the system parameters on the proposed algorithm is discussed and the optimization of parameters is studied. By using experimental chromatographic and spectroscopic data sets, it is proven that the signal-to-noise ratio (SNR) of the analytical signal can be greatly enhanced by the method, and an excellent quantitative relationship between different concentrations and their responses can be obtained. Stochastic resonance may be a promising tool to extend instrumental linear range and to improve the accuracy of micro- or trace analysis.
[ "stochastic resonance", "detection limits", "gas chromatography", "raman spectroscopy" ]
[ "P", "P", "U", "U" ]
-vFzXHW
Exploring syntactic structured features over parse trees for relation extraction using kernel methods
Extracting semantic relationships between entities from text documents is challenging in information extraction and important for deep information processing and management. This paper proposes to use the convolution kernel over parse trees together with support vector machines to model syntactic structured information for relation extraction. Compared with linear kernels, tree kernels can effectively explore the huge space of syntactic structured features implicitly embedded in a parse tree. Our study reveals that the syntactic structured features embedded in a parse tree are very effective in relation extraction and can be well captured by the convolution tree kernel. Evaluation on the ACE benchmark corpora shows that using the convolution tree kernel alone achieves performance comparable to previous best-reported feature-based methods. It also shows that our method significantly outperforms two previous dependency tree kernels for relation extraction. Moreover, this paper proposes a composite kernel for relation extraction by combining the convolution tree kernel with a simple linear kernel. Our study reveals that the composite kernel can effectively capture both flat and structured features without extensive feature engineering, and can easily scale to include more features. Evaluation on the ACE benchmark corpora shows that the composite kernel outperforms previous best-reported methods in relation extraction.
[ "syntactic structured features", "relation extraction", "information extraction", "convolution tree kernel", "composite kernel" ]
[ "P", "P", "P", "P", "P" ]
Hgytp2x
A case study of software process improvement with CMMI-DEV and Scrum in Spanish companies
One of the most commonly used agile methods is Scrum. Capability Maturity Model Integration for Development (CMMI-DEV) is currently the de facto framework for process improvement and for determining the organizational maturity of software development companies. CMMI-DEV and Scrum share certain characteristics, and even though they were developed for different purposes, they can complement each other; as such, they are not in competition. This paper presents a case study of the relationship between level 2 of CMMI-DEV 1.3 and Scrum. The objective of this research is to evaluate how Scrum helps implement a process model such as CMMI-DEV. A detailed case study was conducted among Spanish IT companies and designed according to established guidelines for case studies. There were eight principal activities: case study design, case selection, case study procedures and roles, data collection, analysis, plan validity, study limitations, and reporting. The results obtained show that most of the process areas of CMMI-DEV level 2 were improved by using Scrum. Other issues detected arose during the formal appraisals and illustrated how it is possible to verify with Scrum that the specific goals of CMMI-DEV have been implemented. In addition, the study highlights how the use of open-source tools was useful in improving the process in the companies involved. Based on the case study carried out, adopting Scrum methodologies may increase the quality of software processes.
[ "case study", "software process improvement", "cmmi-dev", "scrum", "agile" ]
[ "P", "P", "P", "P", "P" ]
mNs5UHR
Multiple wire reconnections based on implication flow graph
Global flow optimization (GFO) can perform multiple fanout/fanin wire reconnections at a time by modeling the problem of multiple wire reconnections with a flow graph, and then solving the problem using the max-flow min-cut algorithm on the flow graph. In this article, we propose an efficient multiple wire reconnection technique that modifies the framework of GFO and, as a result, obtains better optimization quality. First, we observe that the flow graph in GFO cannot fully characterize wire reconnections, which causes GFO to lose optimality in several obvious cases. In addition, we find that fanin reconnection can have more optimization power than fanout reconnection, but requires more sophisticated modeling. We reformulate the problem of fanout/fanin reconnections with a new graph, called the implication flow graph (IFG). We show that the problem of wire reconnections on the implication flow graph is NP-complete and also propose an efficient heuristic on the new graph. To demonstrate the effectiveness of our proposed method, we conduct an application which utilizes the flexibility of the wire reconnections explored in the logic domain to further minimize interconnects in the physical layout. Experimental results confirm the effectiveness of the proposed technique.
[ "multiple wire reconnection", "global flow optimization (gfo)", "algorithms", "implication flow graph (ifg)", "automatic test pattern generation (atpg)", "redundant wire", "mandatory assignment" ]
[ "P", "P", "P", "P", "M", "M", "U" ]
4vZ69K2
Incorporating scene priors to dense monocular mapping
This paper presents a dense monocular mapping algorithm that improves the accuracy of state-of-the-art variational and multiview stereo methods by incorporating scene priors into its formulation. Most of the improvement of our proposal is in low-textured image regions and for low-parallax camera motions, two typical failure cases of multiview mapping. The specific priors we model are the planarity of homogeneous color regions, the repeating geometric primitives of the scene, which can be learned from data, and the Manhattan structure of indoor rooms. We evaluate the performance of our method on our own sequences and on the publicly available NYU dataset, emphasizing its strengths and weaknesses in different cases.
[ "monocular slam", "3d reconstruction", "structure from motion" ]
[ "M", "U", "R" ]
kzUk::t
Study on a distributed wavelength routing algorithm in WDM optical transport networks
A new design scheme for a distributed algorithm for routing and wavelength assignment (RWA) is developed in this paper, and the communication rules between the nodes to exchange signaling packets are discussed. The Adaptive-Alternate-Routing-Least-Load (AARLL) algorithm is implemented in the distributed scheme for the first time. Under dynamic traffic circumstances, the influence of the race condition, which does not exist in a centralized scheme, on the network performance is analyzed; the results show that the race condition has a major impact on network performance only under light traffic load, while under medium and heavy traffic loads the impact is very small. To analyze the performance loss caused by adopting the distributed algorithm for RWA, the capacity loss factor (CLF) is introduced, and the calculated results show that the CLF does not exceed 6% under medium traffic load.
[ "dynamic traffic", "race condition", "distributed routing and wavelength assignment", "blocking probability" ]
[ "P", "P", "R", "U" ]
4LDCPjX
Open maps, behavioural equivalences, and congruences
Spans of open maps have been proposed by Joyal, Nielsen, and Winskel as a way of adjoining an abstract equivalence, P-bisimilarity, to a category of models of computation M, where P is an arbitrary subcategory of observations. Part of the motivation was to recast and generalise Milner's well-known strong bisimulation in this categorical setting. An issue left open was the congruence properties of P-bisimilarity. We address the following fundamental question: given a category of models of computation M and a category of observations P, are there any conditions under which algebraic constructs viewed as functors preserve P-bisimilarity? We define the notion of functors being P-factorisable and show how this ensures that P-bisimilarity is a congruence with respect to such functors. Guided by the definition of P-factorisability, we show how it is possible to parametrise proofs of functors being P-factorisable with respect to the category of observations P, i.e., with respect to a behavioural equivalence.
[ "open maps", "congruences", "bisimulation", "process algebra", "category theory" ]
[ "P", "P", "P", "M", "M" ]
4RRaS6F
Recurrent neural networks employing Lyapunov exponents for analysis of ECG signals
An approach based on the consideration that electrocardiogram (ECG) signals are chaotic was presented for automated diagnosis of electrocardiographic changes. This consideration was tested successfully using nonlinear dynamics tools, such as the computation of Lyapunov exponents. A recurrent neural network (RNN) was implemented and used as the basis for detecting variabilities of ECG signals. Four types of ECG beats (normal beat, congestive heart failure beat, ventricular tachyarrhythmia beat, atrial fibrillation beat) obtained from the PhysioBank database were classified. Decision making was performed in two stages: computing features, which were then input into the RNN, and classification using the RNN trained with the Levenberg-Marquardt algorithm. The research demonstrated that the Lyapunov exponents are features that represent the ECG signals well, and the RNN trained on these features achieved high classification accuracies.
[ "lyapunov exponents", "electrocardiogram (ecg) signals", "chaotic signal", "recurrent neural networks (rnns)" ]
[ "P", "P", "P", "P" ]
2d3rmwz
Cross-institutional assessment: Development and implementation of the on-line student survey system
As ABET has increased the need for routine student assessments, engineering faculty are faced with the problem of doing this in an efficient manner that minimizes the time required to conduct, tabulate, and analyze the requisite surveys. To meet this need, researchers at the University of Pittsburgh have developed the On-line Student Survey System (OS3) to facilitate EC 2000 assessment and cross-institutional benchmarking. OS3 allows multiple engineering schools to conduct customized, routine program evaluations using Web-based surveys specifically designed to meet EC 2000 objectives. Since its inception, seven engineering schools have adopted OS3. This article provides an overview of the system, a description of its survey instruments, and an evaluation of the system.
[ "ec 2000", "assessment and evaluation", "outcomes assessment", "web-based assessment" ]
[ "P", "R", "M", "R" ]
3tGNA-4
Computing and visualizing banks sets of dominance relations using relation algebra and RELVIEW
In social choice theory the Banks set is a well-established choice set for tournaments that consists of the undominated elements of the maximal subtournaments. For non-complete dominance relations J. Duggan proposed three possibilities to modify it. We develop relation-algebraic specifications to compute the Banks set, Duggan's modifications, and variants of them. All these specifications are algorithmic and can directly be translated into the programming language of the computer algebra system RELVIEW. We show that the system is well suited for computing and visualizing the Banks set, its modifications, and the objects to be associated with them.
[ "banks set", "dominance relation", "relation algebra", "social choice theory", "relation-algebraic specification", "relview tool", "binary decision diagram" ]
[ "P", "P", "P", "P", "P", "M", "U" ]
38whJ3m
An Analytical Framework for Characterizing Restricted Two Dimensional Cellular Automata Evolution
This paper reports an analytical framework to study a restricted class of two-dimensional CA (2DCA). The imposed restriction enables characterization of 2DCA evolution. The concept of the Rule Vector Graph (RVG), introduced in [6,7] for characterizing one-dimensional CA, is extended in this paper to 2DCA. The design of the RVG for the restricted class of 2DCA is reported. Analysis of the RVG enables identification of necessary and sufficient conditions for a 2DCA to be invertible. Further, traversal of the RVG enables checking of reversibility and also identification of all Non-Reachable States (NRSs) and Self-Loop States (SLSs) of an (m × n) 2DCA in O(n·2^m) time complexity.
[ "rule vector graph (rvg)", "2d cellular automata (2dca)", "restricted 2d ca", "serializable rule" ]
[ "P", "M", "M", "M" ]
3Q79LeS
Problems in the interplay of development and IT operations in system development projects: A Delphi study of Norwegian IT experts
The assumption of the presented work is that the ability of system developers and IT operations personnel to cooperate effectively in system development projects has great impact on the quality of the final system solution, as well as on the service level of its subsequent operation. The present research explores the interplay of system development and IT operations and the challenges they are meeting. We are especially interested in identifying problems encountered between these two parties in system development projects. We identify and rank problems by using a ranking-type Delphi study. We involved 42 Norwegian IT experts and split them into three expert panels: system developers, IT operations personnel and system owners. We then guided them through the three phases of the Delphi method: brainstorming, reduction and ranking. A comprehensive list of 66 problems, organized into seven groups, is compiled. Through a selection and ranking procedure, the panels found the following to be the six most serious problems in the interplay of system development and IT operations: (1) IT operations not being involved in the requirements specification; (2) poor communication and information flow; (3) unsatisfactory test environments; (4) lack of knowledge transfer; (5) systems being put into production before they are complete; and (6) operational routines not being established prior to deployment. The sheer number and variety of problems mentioned, and the respondents' explanations, confirm that this interplay needs attention; the parties agree that they do not cooperate effectively in development projects. The results imply that IT operations should be regarded as an important stakeholder throughout several systems development activities, especially requirements analysis, testing and deployment. Moreover, such involvement should be facilitated by an increased focus on enhancing cooperation and communication.
[ "problems", "interplay", "it operations", "system development", "the delphi method" ]
[ "P", "P", "P", "P", "P" ]
4Td9M1:
Computing the Hausdorff distance between two B-spline curves
This paper presents a geometric pruning method for computing the Hausdorff distance between two B-spline curves. It presents a heuristic method for obtaining the one-sided Hausdorff distance in some interval as a lower bound of the Hausdorff distance, which is also possibly the exact Hausdorff distance. Then, an estimation of the upper bound of the Hausdorff distance in a sub-interval is given, which is used to eliminate the sub-intervals whose upper bounds are smaller than the present lower bound. Conditions for whether the Hausdorff distance occurs at an end point of the two curves are also provided. These conditions are used to turn the Hausdorff distance computation problem between two curves into a minimum or maximum distance computation problem between a point and a curve, which can be solved well. A pruning technique based on several other elimination criteria is utilized to improve the efficiency of the new method. Numerical examples illustrate the efficiency and the robustness of the new method.
[ "hausdorff distance", "root-finding method", "geometric pruning technique" ]
[ "P", "M", "R" ]
4Yow-ZK
How can AI systems deal with large and complex problems?
Problems arising in the near future may grow larger and more complex, beyond the human capability to manage them, and it is worrying that this could cause trouble for human society. The only possible way to solve this problem is to make computers more intelligent, to back up the weak points of human beings. At least, it is necessary to provide a method for capturing and recording all human decisions in problem-solving. This view leads one to the idea of a computer-led interactive system such that the computer becomes able to manage the problem-solving process and make records of individual decisions made by a human being. Autonomy, generality and practicality are the major requirements for such computer-led interactive systems. New information technology to meet these conditions is discussed. It is very different from conventional computer technology in many aspects. The major topics are: a model-based software architecture for future information systems, a modeling scheme to accept and represent a wide range of problems, a method for externalizing human ideas and representing them as a model, a method for solving problems represented in the form of a model, and so on. The paper outlines part of the research work being promoted by the author's group under the sponsorship of the Science and Technology Agency of the Japanese Government.
[ "intelligent information systems", "multistrata model", "problem decomposition", "multiagent system generation", "model evolution" ]
[ "R", "M", "M", "M", "M" ]
2zfr7qs
Improvement of the muscle fractional multimodel for low-rate stimulation
Modeling of isometric contractions, due to motor unit (MU) stimulations of peroneus digiti quarti and peroneus brevis rat muscles, is presented. The modeling is realized through a multimodel, which allows distinguishing asymmetric contraction and relaxation mechanisms for time-domain identification. First, this paper compares two fractional functions and a rational transfer function that model each phase of IIA and IIB MU twitches. The advantages of using fractional functions are underlined, since the number of parameters is minimized. Indeed, fractional models, due to their infinite-dimensional nature, are particularly adapted not only to model complex systems with few parameters but also to obtain a model exploitable in real time for a salamander robot simulator. Finally, the muscle response to 10 Hz pulse stimulation shows non-stationary characteristics. A method modeling the transient state of muscle responses and introducing time-varying parameters is presented.
[ "muscle", "multimodel", "modeling", "system identification", "fractional systems", "biomedical" ]
[ "P", "P", "P", "R", "R", "U" ]
1jYzoPq
DOCKSCORE: a webserver for ranking protein-protein docked poses
Proteins interact with a variety of other molecules, such as nucleic acids, small molecules and other proteins, inside the cell. Structure determination of protein-protein complexes is challenging for several reasons, such as the large molecular weights of these macromolecular complexes, their dynamic nature, and difficulty in purification and sample preparation. Computational docking permits an early understanding of the feasibility and mode of protein-protein interactions. However, docking algorithms propose a number of solutions, and it is a challenging task to select the native or near-native pose(s) from this pool. DockScore is an objective scoring scheme that can be used to rank protein-protein docked poses. It considers several interface parameters, namely surface area, evolutionary conservation, hydrophobicity, short contacts and spatial clustering at the interface, for scoring.
[ "docking", "interactome", "protein interfaces", "quaternary structure prediction" ]
[ "P", "U", "R", "U" ]
1k-spoX
Patterns for service composition
The discovery of suitable services is a crucial, but challenging, activity during service-oriented engineering. While in many scenarios a single service will satisfy the user's needs, there are cases where a combination of services might be appropriate. In this work-in-progress paper we identify several composition patterns that assist in the discovery of appropriate services. We outline a pattern-based algorithm for service discovery and formalize the solution set.
[ "service composition", "software engineering", "formal models", "service-oriented architecture", "theory" ]
[ "P", "M", "M", "M", "U" ]
XGc:uiN
Wireless multimedia delivery over 802.11e with cross-layer optimization techniques
The use of wireless networks has spread beyond simple data transfer to delay-sensitive and loss-tolerant multimedia applications. Over the past few years, wireless multimedia transmission across Wireless Local Area Networks (WLANs) has gained a lot of attention because of the introduction of technologies such as Bluetooth, IEEE 802.11, 3G, and WiMAX. The IEEE 802.11 WLAN has become a dominating technology due to its low cost and ease of implementation. But transmitting video over WLANs in real time remains a challenge because it imposes strong demands on the video codec, the underlying network, and the Media Access Control (MAC) layer. This paper presents a cross-layer mapping algorithm to improve the quality of transmission of H.264 (a recently developed video coding standard of the ITU-T Video Coding Experts Group) video streams over IEEE 802.11e-based wireless networks. The major goals of the H.264 standard were improved rate distortion and enhanced compression performance. Our proposed cross-layer design involves the mapping of H.264 video slices (packets) to appropriate access categories of IEEE 802.11e according to their information significance. We evaluate the performance of our proposed cross-layer design, and the results obtained demonstrate its effectiveness in exploiting characteristics of the MAC and application layers to improve video transmission quality.
[ "wireless", "multimedia", "802.11e", "cross-layer", "network", "video", "h.264", "compression" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1TxiaoJ
Computerized antimicrobial decision support: an offline evaluation of a database-driven empiric antimicrobial guidance program in hospitalized patients with a bloodstream infection
Introduction: We developed a computerized antimicrobial guidance program based on the last 5 years of our laboratory culture data augmented by expert infectious disease logic. The program is designed to assist physicians with the targeting of empiric antimicrobials for hospitalized patients by tracking pathogenic bacteria and their evolving antimicrobial resistance profiles. Costs, toxicities, and environmental impact of antimicrobial use also influence the final recommendations. We undertook the following analysis to verify its potential safety and efficacy in hospitalized patients with a bloodstream infection. Methods: We retrospectively enrolled all inpatients with a positive blood culture for a previously undetermined pathogen during the first 6 months of 2002 and determined the empiric therapy initiated within the 12 h before and after the time of culture. Antimicrobial recommendations from the microbiologic decision support tool were then determined by matching specimen (blood), hospital unit, community- versus hospital-acquired category, age category, and gender. Generated antimicrobial recommendations were tailored to patient allergies, age category, and presence of pregnancy, lactation, or hepatic impairment. Results: The microbiology laboratory recorded 226 unique patient/pathogen blood cultures during the study period. Physicians initiated effective empiric therapy in 150 of the 226 cases, for an effectiveness rate of 66%. The computer-guided therapy was effective in 195 of the 226 cases, for a rate of 86%. A contingency table analysis showed 55 cases where the computer recommendation was effective but the physician's selection was not, and eight cases where the physician's antimicrobials were effective but the computer's were not (P < 0.0001). Discussion: For patients with a bloodstream infection, we found that our computer-guided, statistically derived antimicrobial therapy would potentially improve the rate of effectiveness of empirically chosen antimicrobials.
[ "antibacterial agents/therapeutic use", "bacterial infections/drug therapy", "decision support systems", "clinical", "expert systems", "evaluation studies" ]
[ "M", "M", "M", "U", "M", "R" ]
4JNGj8a
Optimal threshold value of failure-rate for leased products with preventive maintenance actions
This paper investigates the optimal threshold value of failure rates for leased products with a Weibull lifetime distribution. Within a lease period, any product failure is rectified by minimal repairs and the lessor may incur a penalty when the time required to perform a minimal repair exceeds a reasonable time limit. To reduce product failures, additional preventive maintenance actions are carried out when the failure rate reaches a threshold value. Under this maintenance scheme, a mathematical model of the expected total cost is established. Based on the model, the optimal threshold value and the corresponding maintenance degrees are derived such that the expected total cost is minimized. The structural properties of the optimal policy are investigated and an efficient algorithm is provided to search for the optimal policy. Finally, numerical examples are given to illustrate the features of the optimal policy.
[ "threshold value", "leased product", "preventive maintenance", "minimal repair" ]
[ "P", "P", "P", "P" ]
4ZmCYCE
Stable and controllable noise
We introduce a stable noise function with controllable properties. The well-known Perlin noise function is generated by interpolation of a pre-defined random number table. This table must be modified if user-defined constraints are to be satisfied, but modification can destroy the stability of the table. We integrate statistical tools for measuring the stability of a random number table with user constraints within an optimization procedure, so as to create a controlled random number table which nevertheless has a uniform random distribution, no periodicity, and a band-limited property.
[ "noise", "noise control", "random number generation", "procedural textures" ]
[ "P", "R", "R", "M" ]
-wDACFK
Performance analysis of data transmission in MC-CDMA radio interface with turbo codes
The multi-carrier code division multiple access (MC-CDMA) technique is a combination of two radio access techniques, CDMA and orthogonal frequency division multiplexing, and has the advantages of both. The paper presents the design of the transmitter and receiver for an MC-CDMA radio interface. It also presents the encoders and decoders of the turbo codes which were used in simulation of the MC-CDMA technique. Two turbo codes with 8-state recursive systematic convolutional encoders were used in the simulation. The simulation results for transmission quality in an additive white Gaussian noise (AWGN) channel show the bit error rate and frame error rate performance of the MC-CDMA technique with the turbo codes and allow for comparison of the performance of both turbo codes.
[ "mc-cdma", "turbo codes", "sova algorithm" ]
[ "P", "P", "U" ]
-1B2k9G
Semantic tags generation and retrieval for online advertising
One of the main problems in online advertising is to display ads which are relevant and appropriate with respect to what the user is looking for. Search engines often fail to reach this goal as they do not consider the semantics attached to keywords. In this paper we propose a system that tackles the problem from two different angles: helping (i) advertisers to create more efficient ad campaigns and (ii) ad providers to properly match ad content to keywords in search engines. We exploit semantic relations stored in the DBpedia dataset and use a hybrid ranking system to rank keywords and to expand queries formulated by the user. Inputs to our ranking system are (i) the DBpedia dataset and (ii) external information sources such as classical search engine results and social tagging systems. We compare our approach with other RDF similarity measures, proving the validity of our algorithm with an extensive evaluation involving real users.
[ "semantic tagging", "dbpedia", "rdf ranking", "computational advertising" ]
[ "P", "P", "R", "M" ]
2KuvWh1
Spatial decision support systems for vehicle routing
The vehicle routing field is a well-developed area of management science application. There is increasing recognition that effective decision-making in this field requires the incorporation of vehicle routing techniques into a decision support system (DSS). In order to provide decision support for a wide range of problems, routing techniques should be combined with systems that can take advantage of new technologies. These include spatial techniques drawn from the field of geographic information systems (GIS). A synthesis of appropriate algorithms and a GIS based computer system is identified as being necessary for effective decision support for the vehicle routing problem.
[ "spatial decision support systems", "decision support systems", "vehicle routing", "geographic information systems" ]
[ "P", "P", "P", "P" ]
169W7t7
An integrated approach to achieving optimal design of computer games
In a time-to-market environment, designers may not be able to incorporate all the design features in a computer game. For each feature, there are several levels of implementation, each corresponding to a different level of benefit as well as cost. Therefore, a trade-off decision for determining appropriate levels of implementation is very important, yet has been rarely studied in the literature. This paper presents an approach to solve the trade-off decision problem. The approach applies the neural network technique and develops a genetic algorithm to optimize the design of computer games. With this approach, a near-optimal design alternative can be identified in a timely fashion.
[ "optimization", "computer game", "neural networks", "genetic algorithms" ]
[ "P", "P", "P", "P" ]
2Z1E8WK
Elliptical Object Detection by a Modified RANSAC with Sampling Constraint from Boundary Curves' Clustering
This paper proposes a method for detecting ellipses from an image despite (1) multiple colors within the ellipses, (2) partially occluded ellipse boundaries, (3) noisy, locally deformed boundaries of ellipses, (4) presence of multiple objects other than the ellipses in the image, and (5) combinations of (1) through (4). After boundary curves are obtained by edge detection, a segment-reconnect method obtains boundary clusters by utilizing the first-order difference curves of the edge orientation of each pixel in the boundary curves. Then, a modified RANSAC detects ellipses by choosing five pixels randomly from the boundary clusters, where overlapped ellipses are merged. Experimental results using synthesized images and real images demonstrate the effectiveness of the proposed method, together with a comparison with the Randomized Hough Transform, a well-known conventional method.
[ "a modified ransac", "ellipse detection" ]
[ "P", "R" ]
3JwQCKS
Supporting End-User Development through a New Composition Model: An Empirical Study
End-user development (EUD) is much hyped, and its impact has outstripped even the most optimistic forecasts. Even so, the vision of end users programming their own solutions has not yet materialized. This will continue to be so unless we in both industry and the research community set ourselves the ambitious challenge of devising, end to end, an end-user application development model for developing a new age of EUD tools. We have embarked on this venture, and this paper presents the main insights and outcomes of our research and development efforts as part of a number of successful EU research projects. Our proposal aims not only to reshape software engineering to meet the needs of EUD but also to refashion its components as solution building blocks instead of programs and software developments. This way, end users will really be empowered to build solutions based on artefacts akin to their expertise and understanding of ideal solutions.
[ "end-user development", "end-user software engineering", "domain experts", "domain-specific software development", "ecologies of participation" ]
[ "P", "R", "U", "M", "M" ]
2GeoJFr
The asymptotic optimization of pre-edited ANN classifier
The generalization problem of an artificial neural network (ANN) classifier with unlimited size of training sample, namely asymptotic optimization in probability, is discussed in this paper. As an improved ANN model, the pre-edited ANN classifier shows better practical performance than the standard one. However, it has not been widely applied due to the absence of the related theoretical support. To further promote its application in practice, the asymptotic optimization of the pre-edited ANN classifier is studied in this paper. To help study ANN asymptotic optimization in probability, we give a review of previous research on asymptotic optimization in probability of non-parametric classifiers, and group the main methods into four classes: the two-step method, one-step method, generalization method and hypothesis method. In this paper, we adopt a mixed generalization/hypothesis method to prove that the pre-edited ANN is asymptotically optimal in probability. Furthermore, a simulation is presented to provide experimental support for our theoretical work.
[ "asymptotic optimization", "generalization problem", "artificial neural network" ]
[ "P", "P", "P" ]
3aj79VV
A geometric calibration methodology for single-head cone-beam X-ray systems
During X-ray based quality inspection, accurate reconstruction of a 3D object model from a set of its 2D X-ray projections requires efficient geometric calibration, i.e., accurate estimation of the geometric parameters of the setup. We present a calibration methodology for the estimation of the geometric parameters of single-head cone-beam X-ray radiography systems. Our method is related to known approaches regarding camera calibration and geometric calibration of tomography/radiography systems, but performs better in terms of computational efficiency.
[ "geometric calibration", "radiography", "camera calibration", "x-ray inspection", "extrinsic parameters" ]
[ "P", "P", "P", "R", "M" ]
-oAUEhG
Sensing behavior of Al-rich AlN nanotube toward hydrogen cyanide
In order to explore a sensor for detection of toxic hydrogen cyanide (HCN) molecules, interaction of pristine and defected Al-rich aluminum nitride nanotubes (AlNNT) with a HCN molecule has been investigated using density functional theory calculations in terms of energetic, geometric, and electronic properties. It has been found that unlike the pristine AlNNT, the Al-rich AlNNT can effectively interact with the HCN molecule so that its conductivity changes upon the exposure to this molecule. The adsorption energies of HCN on the pristine and defected AlNNTs have been calculated to be in the range of -0.16 to -0.62 eV and -1.75 to -2.21 eV, respectively. We believe that creating Al-rich defects may be a good strategy for improving the sensitivity of these tubes toward HCN molecules, which cannot be trapped and detected by the pristine AlNNT.
[ "nanotube", "sensor", "b3lyp", "dft", "theoretical study" ]
[ "P", "P", "U", "U", "U" ]
-y&Qgzt
A grouping genetic algorithm for the microcell sectorization problem
The number of wireless users has steadily increased over the last decade, leading to the need for methods that efficiently use the limited bandwidth available. Reducing the size of the cells in a cellular network increases the rate of frequency reuse or channel reuse, thus increasing the network capacity. The drawback of this approach is the increased cost associated with installation and coordination of the additional base stations. A code-division multiple-access network where the base stations are connected to the central station by fiber has been proposed to reduce the installation costs. To reduce the coordination costs and the number of handoffs, sectorization (grouping) of the cells is suggested. We propose a dynamic sectorization of the cells, depending on the current sectorization and the time-varying traffic. A grouping genetic algorithm (GGA) is proposed to find a solution which minimizes costs. The computational results demonstrate the effectiveness of the algorithm across a wide range of problems. The GGA is shown to be a useful tool to efficiently allocate the limited number of channels available.
[ "grouping genetic algorithm", "genetic algorithm", "microcell sectorization", "code-division multiple-access", "dynamic channel allocation", "wireless communication networks" ]
[ "P", "P", "P", "P", "R", "M" ]
13D6DZ6
Spatial data mining using fuzzy logic in an object-oriented geographical information database
The Mapping Sciences Section of the Naval Research Laboratory, Stennis Space Center, has realized the enormous benefits of spatial data warehousing and database integration with the implementation of the Geospatial Information Database (GIDB). An object-oriented approach was used to develop an object model that could be easily expanded to include all geographic data types. Building on object-oriented technology, standards such as the Common Object Request Broker Architecture (CORBA) and the Virtual Reality Modeling Language (VRML) enabled 2-dimensional as well as 3-dimensional display over the internet. However, in the process of developing the GIDB system, the question of what to do with all the data inevitably arose. Data exist to be used and exploited by users, but what can users do with all the data? Is the availability of so much information overwhelming to the users? The use of spatial data mining techniques to help users make sense of the wealth of data in the GIDB is the focus of this paper. After a general discussion of spatial data mining, we present a specific technique for integrating a fuzzy set model for spatial relationship determination with the object-oriented model of the GIDB.
[ "spatial data mining", "object-oriented", "fuzzy spatial relationships" ]
[ "P", "P", "R" ]
hN5a5AV
A generalized Ufnarovski graph
An important tool for studying standard finitely presented algebras is the Ufnarovski graph. In this paper we extend the use of the Ufnarovski graph to automaton algebras, introducing the generalized Ufnarovski graph. As an application, we show how this construction can be used to test Noetherianity of automaton algebras.
[ "ufnarovski graph", "automaton algebras", "noetherianity", "non-commutative algebras", "automata", "dickson's lemma" ]
[ "P", "P", "P", "M", "U", "U" ]
3upx:Du
Autonomic microprocessor execution via self-repairing arrays
To achieve high reliability despite hard faults that occur during operation, and to achieve high yield despite defects introduced at fabrication, a microprocessor must be able to tolerate hard faults. In this paper, we present a framework for autonomic self-repair of the array structures in microprocessors (e.g., the reorder buffer and instruction window). The framework consists of three aspects: 1) detecting/diagnosing the fault, 2) recovering from the resultant error, and 3) mapping out the faulty portion of the array. For each aspect, we present design options. Based on this framework, we develop two particular schemes for self-repairing array structures (SRAS). Simulation results show that one of our SRAS schemes adds some performance overhead in the fault-free case, but that both of them mask hard faults 1) with less hardware overhead cost than higher-level redundancy (e.g., IBM mainframes) and 2) without the per-error performance penalty of existing low-cost techniques that combine error detection with pipeline flushes for backward error recovery (BER). When hard faults are present in arrays, due to operational faults or fabrication defects, SRAS schemes outperform BER by not having to frequently flush the pipeline.
[ "microprocessors", "logic design reliability and testing", "and microcomputers" ]
[ "P", "M", "M" ]
1EnY1Vp
Models and solution techniques for production planning problems with increasing byproducts
We consider a production planning problem where the production process creates a mixture of desirable products and undesirable byproducts. In this production process, at any point in time the fraction of the mixture that is an undesirable byproduct increases monotonically as a function of the cumulative mixture production up to that time. The mathematical formulation of this continuous-time problem is nonconvex. We present a discrete-time mixed-integer nonlinear programming (MINLP) formulation that exploits the increasing nature of the byproduct ratio function. We demonstrate that this new formulation is more accurate than a previously proposed MINLP formulation. We describe three different mixed-integer linear programming (MILP) approximation and relaxation models of this nonconvex MINLP, and we derive modifications that strengthen the linear programming relaxations of these models. We also introduce nonlinear programming formulations to choose piecewise-linear approximations and relaxations of multiple functions that share the same domain and use the same set of break points in the domain. We conclude with computational experiments that demonstrate that the proposed formulation is more accurate than the previous formulation, and that the strengthened MILP approximation and relaxation models can be used to obtain provably near-optimal solutions for large instances of this nonconvex MINLP. Experiments also illustrate the quality of the piecewise-linear approximations produced by our nonlinear programming formulations.
[ "production planning", "mixed integer nonlinear programming", "piecewise linear approximation" ]
[ "P", "M", "M" ]
-M87bbE
Network-on-Chip interconnect for pairing-based cryptographic IP cores
On-chip data traffic in cryptographic circuits often consists of very long words or large groups of smaller words exchanged between processing elements. The resulting wide cross-chip buses exhibit power, congestion and scalability problems. In this paper, two case-study cryptographic IP cores with demanding interconnect requirements are implemented on 65 nm CMOS. Lightweight, custom bus-replacement Networks-on-Chip (NoCs) have been developed for both cores. Results show that eliminating the 251-bit-wide cross-chip cryptographic buses dramatically improves the quality of physical implementation. The results have applicability to wire-constrained designs in other domains.
[ "network-on-chip", "interconnect", "cryptography", "tate pairing" ]
[ "P", "P", "U", "U" ]
4sFZfeV
Development of 2-DOF tilting actuator with remote center of rotation for human operated non-contact handling tool
This paper describes the development of a new and unique 2-DOF tilting actuator that has a remote center of rotation. The tilting actuator is part of a non-contact handling tool that allows thin, contact-sensitive objects like silicon wafers or coated sheet metal to be manipulated without any contact between the tool and the object. Tilting is necessary to keep the object aligned with the levitator during large planar accelerations, similar to how a waiter tilts a tray of beverages during transport. A feed-forward tilt implementation based on known acceleration eliminates the need for a sensor, but it requires that the center of rotation coincide with the center of mass of the levitated object to avoid disturbances from the tilting action. This is realized by a mechanical solution of a dome-shaped structure that is supported by three ball bearings on the inner surface. The tilting actuator is attached to an in-house developed admittance-controlled SCARA-type haptic device, which allows both automated and human-operated manipulation. In this collaborative system, where the haptic device assists the human operator in real time, the human operator can successfully perform the manipulation task with ease and without failure, which would not have been possible without the haptic assistance.
[ "tilting", "levitation", "haptic device", "acceleration compensation", "human-machine collaborative system", "pick and place" ]
[ "P", "P", "P", "M", "M", "M" ]
26RXsXj
On the performance anomaly in WiMAX networks
The WiMAX system carries a wide range of services in urban and rural environments supporting quality of service. A key element of the QoS framework is the scheduling algorithm adopted by the Base Station (BS). In this paper, we analyze the saturation throughput perceived by Mobile Stations in the cases of two BS scheduling algorithms: Deficit Round Robin (DRR) and time-based DRR. We demonstrate that the WiFi issue of Performance Anomaly also occurs in WiMAX networks: when the BS uses scheduling approaches aimed at achieving throughput-fairness, like DRR. Performance Anomaly means that when some Mobile Stations (MSs) use a very low bit rate, the throughput of MSs with a high bit rate is dramatically degraded. We propose time-based DRR as a viable solution to remove the Performance Anomaly. Time-based DRR is a simple modification of the DRR algorithm that achieves time-fairness. Its implementation is feasible in WiMAX. The analysis is carried out by means of analytical models supported by NS2 simulations. Two scenarios are considered: the first is suitably set up to highlight and understand the phenomenon of Performance Anomaly; the second examines the impact of Performance Anomaly on a system level focusing on a rural environment.
[ "performance anomaly", "wimax", "scheduling", "deficit round robin", "802.16" ]
[ "P", "P", "P", "P", "U" ]
Ph:oCLc
Cryptanalysis of an enhanced timestamp-based password authentication scheme
Recently, Fan proposed an enhanced scheme to improve the security of Yang-Shieh's timestamp-based password authentication scheme. The enhanced scheme can withstand the attacks presented by Chan, Cheng and Fan. In this paper, we show that the enhanced scheme is still insecure. An intruder is able to construct a forged login request by intercepting legitimate login requests, and can pass the system authentication with a non-negligible probability.
[ "cryptanalysis", "password authentication", "smart card" ]
[ "P", "P", "U" ]
57TkfSL
Integrating evolution strategies and genetic algorithms with agent-based modeling for flushing a contaminated water distribution system
Water utilities can prepare for water distribution hazards, such as the presence of contaminants in the pipe network and failure of physical components. In contamination events, the complex interactions among managers' operational decisions, consumers' water consumption choices, and the hydraulics and contaminant transport in the water distribution system may influence the contaminant plume so that a typical engineering model may not properly predict public health consequences. A complex adaptive system (CAS) approach couples engineering models of a water distribution system with agent-based models of consumers and public officials. Development of threat management strategies, which prescribe a set of actions to mitigate public health consequences, is enabled through a simulation-optimization framework that couples evolutionary algorithms with the CAS model. Evolution strategies and genetic algorithm-based approaches are developed and compared for an illustrative case study to identify a flushing strategy for opening hydrants to minimize the number of exposed consumers and maintain acceptable levels of service in the network.
[ "agent-based model", "evolutionary algorithms", "sociotechnical systems", "water distribution system security" ]
[ "P", "P", "M", "M" ]
359tvQb
A finite element algorithm for particle/droplet trajectory tracking, tested in a liquid-liquid system in the presence of an external electric field
In this paper we are concerned with modeling a dispersion of electrically charged droplets in motion in a second immiscible liquid phase continuum in the presence of an external electric field. The system is highly relevant to solvent extraction processes and to liquid-liquid reactions such as phase transfer catalysis. Electrostatic enhancement of liquid-liquid contacting processes is well known and relies upon intensified drop breakup and drop acceleration due to the presence of significant electrical forces. The enhanced drop motion results in greatly increased rates of mass transfer. The scope of the current work is the mathematical modeling and experimental validation of the trajectories of multiple charged drops in three-dimensional motion in a liquid-liquid system in the presence of an externally applied field. A new particle tracking algorithm using a tetrahedral finite element mesh to solve the relevant system of differential equations is described in detail. Such elements are shown to be especially convenient for real-world simulation because of their simplicity, their high stability with strictly linear interpolation, and their flexibility for dealing with complex computational domains. The trajectory tracking calculations for several hundred particles in a mesh consisting of 50,000 tetrahedral elements were shown to require only a few seconds of computational time on a PC-class computer. Using a double-step algorithm, with intermediate velocity interpolation, second-order convergence was proven in a rigorous benchmark test. Simulations of individual drop motion proved to be very accurate when compared with experimental measurements. The full three-dimensional simulation, which has greater requirements and involves more degrees of freedom than the earlier two-dimensional case, was successfully demonstrated, with greater accuracy and economical use of computational time. A preliminary simulation of the swarming motion of droplets using a cloud model is also presented. This shows significant promise for the evaluation of the mass distribution of very small droplets (approximately 1 to 100 μm) in a real contactor for optimal shape design. The model, when coupled with models for mass transfer kinetics and interfacial reaction kinetics, should provide a very valuable tool for liquid-liquid reactor design.
[ "liquid-liquid system", "charged drops", "finite element analysis", "trajectory prediction", "swarming drops" ]
[ "P", "P", "M", "M", "R" ]
1nMX3kx
Asymptotical tests in 2×2 comparative trials (unconditional approach)
The unconditional Barnard's test for the comparison of two independent proportions is difficult to apply even with moderately large samples. The alternative is to use a χ2-type, arc sine or mid-p asymptotic test. In the paper, the authors evaluate some 60 of these tests, some new and others already familiar. For the ordinary significances, the optimal tests are the arc sine methods (with the improvement proposed by Anscombe), the χ2 ones given by Pearson (with a correction for continuity of 2 or of 1 depending on whether the sample sizes are equal or different) and the mid-p-value ones given by Fisher (using the criterion proposed by Armitage, when applied as a two-tailed test). For one-(two-)tailed tests, the first method generally produces reliable results for E>10.5 (for E>9 and unbalanced samples), the second method does so for E>9 (E>6), and the third does so in all cases, although for E≤6 (E≤10.5) it usually gives too many conservative results. E refers to the minimum expected quantity.
[ "barnard's test", "mid-p-value", "arc sine transformation", "binomial proportions", "continuity correction", "fisher's exact test", "22 tables", "unconditional test", "validity conditions" ]
[ "P", "P", "M", "M", "R", "M", "M", "R", "U" ]
4F1EoPg
Automation of locality recognition in ADAS plus
In North America, people call the directory assistance operator to find the phone number of a business or residential listing. The directory assistance service is generally maintained by telcos, and it represents a significant cost to them. Partial or complete automation of directory assistance would result in significant cost savings for telcos. Nortel Networks has a product called Automated Directory Assistance System (ADAS) Plus which partially automates this directory assistance function through the use of speech recognition. The system has been deployed all across Quebec and through most of US West and BellSouth. ADAS Plus primarily automates the response to the question "For what city?" through speech recognition. We give details of this speech recognition system and outline its performance in the deployed regions.
[ "directory assistance", "speech recognition", "directory assistance automation", "city name recognition" ]
[ "P", "P", "R", "M" ]
4tXmeAE
Interval-valued Fuzzy Sets in Soft Computing
In this work, we explain the reasons for which, for some specific problems, interval-valued fuzzy sets must be considered a basic component of Soft Computing.
[ "interval-valued fuzzy set", "fuzzy set", "soft computing", "type-2 fuzzy set" ]
[ "P", "P", "P", "M" ]
k67adrf
Detecting codimension-two objects in an image with Ginzburg-Landau models
In this paper, we propose a new mathematical model for detecting in an image singularities of codimension greater than or equal to two. This means we want to detect isolated points in a 2-D image or points and curves in a 3-D image. We draw our inspiration from Ginzburg-Landau (G-L) models, which have proved their efficiency for modeling many phenomena in physics. We introduce the model, state its mathematical properties, and give some experimental results demonstrating its capability in image processing.
[ "ginzburg-landau model", "points detection", "segmentation", "pde", "biological images", "sar images" ]
[ "P", "R", "U", "U", "M", "M" ]
-k4Ns9x
Counterexamples to the article "stability properties of nonlinear difference equations and conditions for boundedness"
In this paper, we give some counterexamples to the article "Stability properties of nonlinear difference equations and conditions for boundedness", Computers Math. Applic. 38 (2), 29-35 (1999).
[ "counterexamples", "difference equations", "period-2 solutions" ]
[ "P", "P", "U" ]
5LTmiCg
Using the parallel algebraic recursive multilevel solver in modern physical applications
This paper discusses the application of a few parallel preconditioning techniques, collected in the recently developed suite of codes Parallel Algebraic Recursive Multilevel Solver (pARMS), to tackling large-scale sparse linear systems arising from real-life applications. In particular, we study the effect of different algorithmic variations and parameter choices on the overall performance of the distributed preconditioners in pARMS by means of numerical experiments related to a few realistic applications. These applications include magnetohydrodynamics, nonlinear acoustic field simulation, and tire design.
[ "nonlinear acoustic field simulation", "tire design", "parallel algebraic multilevel preconditioning", "distributed sparse linear systems", "magnetohydrodynamic flows" ]
[ "P", "P", "R", "R", "M" ]
2PXfz-V
A fast spectral element solver combining static condensation and multigrid techniques
We propose a spectral element multigrid method for the two-dimensional Helmholtz equation discretized on regular grids. Combining p-multigrid with static condensation, the method achieves nearly linear complexity with an order-independent convergence rate for solving the condensed equations. For smoothing we consider two groups of edge-based relaxation schemes, the best of which attains a multigrid convergence rate of approximately 0.014 to 0.028. Numerical experiments have been carried out that demonstrate the robustness of the approach for orders up to 32 and a total of 10^9 degrees of freedom. In comparison with a fast finite difference solver, the latter is clearly outperformed already for errors of one percent or lower.
[ "static condensation", "multigrid method", "spectral element method", "elliptic equations" ]
[ "P", "P", "R", "M" ]
1JbNF&T
Retrieval of images of man-made structures based on projective invariance
In this paper we propose a geometry-based image retrieval scheme that makes use of projectively invariant features. The cross-ratio (CR) is an invariant feature under projective transformations for collinear points. We compute the CRs of point sets in quadruplets, and the CR histogram is used as the feature for retrieval purposes. Being a geometric feature, it allows us to retrieve similar images irrespective of viewpoint and illumination changes. We can retrieve the same building even if its facade has undergone a fresh coat of paint! Color and textural features can also be included, if desired. Experimental results show very good retrieval accuracy when tested on an image database of size 4000. The method is very effective in retrieving images of man-made objects rich in polygonal structures, like buildings, rail tracks, etc.
[ "projective invariance", "cross-ratio", "perspective transformation", "cross-ratio histogram", "motif cooccurence matrix", "precision", "recall" ]
[ "P", "P", "M", "R", "U", "U", "U" ]
26hnJ85
Human Pacman: a mobile, wide-area entertainment system based on physical, social, and ubiquitous computing
Human Pacman is a novel interactive entertainment system that ventures to embed the natural physical world seamlessly with a fantasy virtual playground by capitalizing on mobile computing, wireless LAN, ubiquitous computing, and motion-tracking technologies. Our Human Pacman research is a physical role-playing augmented-reality computer fantasy together with real human social and mobile gaming. It emphasizes collaboration and competition between players in a wide outdoor physical area, which allows natural wide-area human physical movements. Pacmen and Ghosts are now real human players in the real world, experiencing mixed computer-graphics fantasy and reality provided by wearable computers. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between real and virtual worlds. We believe Human Pacman is pioneering a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.
[ "ubiquitous computing", "collaboration", "wearable computer", "physical interaction", "social computing", "tangible interaction" ]
[ "P", "P", "P", "R", "R", "R" ]
2BifhSS
A solution algorithm for non-convex mixed integer optimization problems with only few continuous variables
We suggest a new method for solving nonlinear mixed-integer programs. We prove convergence of our method. We identify a special case in which the new method finds an exact global optimum in a finite number of iterations. We identify cases in which our method works efficiently. We evaluate the method numerically and show that it outperforms standard solvers.
[ "global optimization", "combinatorial optimization", "non-convex optimization", "mixed-integer optimization", "branch-and-bound methods", "facility location problems" ]
[ "R", "M", "R", "R", "M", "M" ]
-W776bj
towards quality metrics for openstreetmap
Volunteered Geographic Information (VGI) is currently a "hot topic" in the GIS community. The OpenStreetMap (OSM) project is one of the most popular and well supported examples of VGI. Traditional measures of spatial data quality are often not applicable to OSM, as in many cases it is not possible to access ground-truth spatial data for all regions mapped by OSM. We investigate how to develop measures of quality for OSM which operate in an unsupervised manner without reference to a "trusted" source of ground-truth data. We provide results of analysis of OSM data from several European countries. The results highlight specific quality issues in OSM. Results of comparing OSM with ground-truth data for Ireland are also presented.
[ "openstreetmap", "shape representation", "quality of spatial data" ]
[ "P", "U", "R" ]
3EXrqCQ
Towards Agile Application Integration with M2M Platforms
M2M (Machine-to-Machine) technology makes it possible to network all kinds of terminal devices and their corresponding enterprise applications. Therefore, several M2M platforms were developed in China in order to collect information from terminal devices dispersed all over the local places through 3G wireless networks. However, when enterprise applications try to integrate with M2M platforms, they must be maintained and refactored to adapt to the heterogeneous features and properties of M2M platforms. Moreover, syntactical and semantic unification for information sharing among applications and devices is still unsolved because of raw data transmission and the usage of distinct business vocabularies. In this paper, we propose and develop an M2M Middleware to support agile application integration with M2M platforms. This middleware imports an event engine and XML-based syntax to handle the syntactical unification, makes use of Ontology-based semantic mapping to solve the semantic unification, and adopts WebService and ETL techniques to sustain a multi-pattern interactive approach, in order to integrate applications agilely with the M2M platform. The M2M Middleware has been applied in the China Telecom M2M platform. The operation results show that applications require less time and effort when being integrated with the M2M platform.
[ "m2m", "syntactical and semantic unification", "m2m middleware", "ontology" ]
[ "P", "P", "P", "U" ]
31Z3Y3y
SAS and SPLUS programs to perform Cox regression without convergence problems
When analyzing survival data, the parameter estimates and consequently the relative risk estimates of a Cox model sometimes do not converge to finite values. This phenomenon is due to special conditions in a data set and is known as monotone likelihood. Statistical software packages for Cox regression using the maximum likelihood method cannot appropriately deal with this problem. A new procedure to solve the problem has been proposed by G. Heinze and M. Schemper (A solution to the problem of monotone likelihood in Cox regression, Biometrics 57 (2001)). It has been shown that, unlike the standard maximum likelihood method, this method always leads to finite parameter estimates. We developed a SAS macro and an SPLUS library to make this method available from within these widely used statistical software packages. Our programs are also capable of performing interval estimation based on the profile penalized log likelihood (PPL) and of plotting the PPL function, as suggested in the same work.
[ "monotone likelihood", "fortran", "nonexistence of parameter estimates", "penalized likelihood", "proportional hazards regression", "survival analysis" ]
[ "P", "U", "M", "R", "M", "M" ]
-CfEMgU
Position estimator for a brushless-DC machine using core saturation and stator current slopes
Purpose - This paper is devoted to the investigation of position estimation for a brushless DC machine using only its stator currents. The first application is for a hybrid electric vehicle, where the generator will be used as a motor to start the internal combustion engine (ICE). Design/methodology/approach - This paper describes how to estimate the rotor position of a brushless DC (BLDC) machine. Two different strategies, both based on stator currents, will be used: one for low speeds to start the ICE, and one for normal speeds for future applications in a pure electric vehicle (EV). The first uses an estimation method based on core saturation, and the second is based on the determination of the current slopes on two of the three phases. The proposed algorithms need to measure neither the machine parameters nor the back EMF. The methods use the information contained in the current magnitudes and slopes, and the machine mechanical speed. The system was implemented using a Digital Signal Processor (TMS320F241), which controls the phase currents and performs all the calculations required for position estimation. Additionally, the PWM signals are transmitted through a fiber optic link to minimize noise production and errors on commutations. Findings - The paper shows how an internal combustion engine can be started using this approach with a brushless motor and kept synchronized. Research limitations/implications - This work is being applied to a hybrid electric vehicle. Originality/value - The paper proposes a new way to start the internal combustion engine for hybrid vehicle applications through the estimation of the magnets' position. It also shows a way to estimate the position at other speeds for battery charging of the vehicle.
[ "position estimation", "brushless dc and synchronous machines", "sensorless starting torque control", "road vehicles", "electric cells" ]
[ "P", "R", "M", "M", "M" ]
3hcBF&3
Ambiguity-Free Edge-Bundling for Interactive Graph Visualization
Graph visualization has been widely used to understand and present both global structural and local adjacency information in relational data sets (e.g., transportation networks, citation networks, or social networks). Graphs with dense edges, however, are difficult to visualize because fast layout and good clarity are not always easily achieved. When the number of edges is large, edge bundling can be used to improve the clarity, but in many cases the edges can still be too cluttered to permit correct interpretation of the relations between nodes. In this paper, we present an ambiguity-free edge-bundling method especially for improving the local detailed view of a complex graph. Our method makes more efficient use of display space and supports detail-on-demand viewing through an interactive interface. We demonstrate the effectiveness of our method with public coauthorship network data.
[ "graph visualization", "edge bundling", "detail-on-demand", "network visualization", "edge ambiguity", "edge congestion", "interactive navigation" ]
[ "P", "P", "P", "R", "M", "M", "M" ]
3&RPM2r
Iterative processes with errors for nonlinear set-valued variational inclusions involving accretive type mappings
The purpose of this paper is to introduce a new class of nonlinear set-valued variational inclusions in Banach spaces and to study the existence of solutions and the convergence of Ishikawa iterative processes with errors for this class of nonlinear set-valued variational inclusions involving accretive type mappings. Our results extend and improve the corresponding results of Chang, Huang and Yuan.
[ "nonlinear set-valued variational inclusion", "convergence", "accretive mapping", "phi-strongly accretive mapping", "weak hausdorff lower semicontinuity", "ishikawa iterative sequence with errors" ]
[ "P", "P", "R", "M", "U", "M" ]
4KS1dhs
Multi-objective genetic algorithms applied to low power pressure microsensor design
Purpose - The purpose of this paper is to explain in detail the optimization of the sensitivity versus the power consumption of a pressure microsensor using multi-objective genetic algorithms. Design/methodology/approach - The tradeoff between sensitivity and power consumption is analyzed and the Pareto frontier is identified by using NSGA-II, AMGA-II and epsilon-MOEA methods. Findings - Comparison results demonstrate that NSGA-II provides optimal solutions over the entire design space for spread metric analysis, and AMGA-II is better for convergence metric analysis. Originality/value - This paper provides a new multiobjective optimization tool for the designers of low power pressure microsensors.
[ "multi-objective", "genetic algorithm", "microsensor", "optimization", "low power optimization", "pressure sensor" ]
[ "P", "P", "P", "P", "R", "M" ]
-DapsiV
New Variational Formulations for Level Set Evolution Without Reinitialization with Applications to Image Segmentation
Interface evolution problems are often solved elegantly by the level set method, which generally requires the time-consuming reinitialization process. In order to avoid reinitialization, we reformulate the variational model as a constrained optimization problem. Then we present an augmented Lagrangian method and a projection Lagrangian method to solve the constrained model and propose two gradient-type algorithms. For the augmented Lagrangian method, we employ the Uzawa scheme to update the Lagrange multiplier. For the projection Lagrangian method, we use the variable splitting technique and get an explicit expression for the Lagrange multiplier. We apply the two approaches to the Chan-Vese model and obtain two efficient alternating iterative algorithms based on the semi-implicit additive operator splitting scheme. Numerical results on various synthetic and real images are provided to compare our methods with two others, which demonstrate effectiveness and efficiency of our algorithms.
[ "reinitialization", "level set method", "augmented lagrangian method", "projection lagrangian method", "chan-vese model", "additive operator splitting" ]
[ "P", "P", "P", "P", "P", "P" ]
-w9&WuH
An iris recognition approach through structural pattern analysis methods
Continuous efforts have been made to improve the robustness of iris coding methods since Daugman's pioneering work on iris recognition was published. Iris recognition is at present used in several scenarios (airport check-in, refugee control etc.) with very satisfactory results. However, in order to achieve acceptable error rates several imaging constraints are enforced, which reduce the fluidity of the iris recognition systems. The majority of the published iris recognition methods follow a statistical pattern recognition paradigm and encode the iris texture information through phase, zero-crossing or texture-analysis based methods. In this paper we propose a method that follows the structural (syntactic) pattern recognition paradigm. In addition to the intrinsic advantages of this type of approach (intuitive description and human perception of the system functioning), our experiments show that the proposed method behaves comparably to the statistical approach that constitutes the basis of nearly all deployed systems.
[ "structural pattern analysis", "noisy iris recognition", "graph matching", "biometrics" ]
[ "P", "M", "U", "U" ]
2LvK&zf
Embedding intelligent planning capability to DEVS models by goal regression method
This paper presents RG-DEVS (ReGression-Discrete EVent System Specification) formalism for embedding intelligent planning capability to DEVS models. RG-DEVS, an extension of classic DEVS, expands the classes of system models that can be represented in DEVS. The goal regression method of the artificial intelligence (AI) production system is exploited for the dynamic generation of the model's sequential states and selection of state transition rules during model execution. Thus, generated states and rules form subgoals and an action plan, respectively. The mechanism for detecting missed execution of the plan and building an amended or new plan for performing corrective actions is also provided. An example application to a transportation vehicle is given to show how RG-DEVS can be applied.
[ "devs", "goal regression", "rg-devs", "simulation", "ai planning", "interdisciplinary research" ]
[ "P", "P", "P", "U", "R", "U" ]
3ELxK:W
Mining Taverna's semantic web of provenance
Taverna is a workflow workbench developed as part of the UK's (my)Grid project. Taverna's provenance model captures both internal provenance locally generated in Taverna and external provenance gathered from third-party data providers. This model also supports overlaying secondary provenance over the primary logs and lineage. This design is motivated by the particular properties of bioinformatics data and services used in Taverna. A Semantic Web of provenance, Ouzo, is built to combine the above different provenance by means of semantic annotations. This paper shows how Ouzo can be mined by a provenance usage component, Provenance Query and Answer (ProQA). ProQA supports provenance retrieval as well as provenance abstraction, aggregation, and semantic reasoning. ProQA is implemented as a suite of APIs which can be deployed as provenance services to compose system provenance workflows that analyse experiment results using the provenance records. We show how these features of Taverna's provenance support us in answering the questions from the provenance challenge workshop and a set of additional provenance queries.
[ "provenance", "workflow", "semantic annotation" ]
[ "P", "P", "P" ]
287h1G4
DACs: Bringing direct access to variable-length codes
We present a new variable-length encoding scheme for sequences of integers, Directly Addressable Codes (DACs), which enables direct access to any element of the encoded sequence without the need for any sampling method. Our proposal is a kind of implicit data structure that introduces synchronism in the encoded sequence without using asymptotically any extra space. We show some experiments demonstrating that the technique is not only simple, but also competitive in time and space with existing solutions in several applications, such as the representation of LCP arrays or high-order entropy-compressed sequences.
[ "variable length codes", "random access" ]
[ "M", "M" ]
-Mz4nbK
The information structure of indulgent consensus
To solve consensus, distributed systems have to be equipped with oracles such as a failure detector, a leader capability, or a random number generator. For each oracle, various consensus algorithms have been devised. Some of these algorithms are indulgent toward their oracle in the sense that they never violate consensus safety, no matter how the underlying oracle behaves. This paper presents a simple and generic indulgent consensus algorithm that can be instantiated with any specific oracle and be as efficient as any ad hoc consensus algorithm initially devised with that oracle in mind. The key to combining genericity and efficiency is to factor out the information structure of indulgent consensus executions within a new distributed abstraction, which we call "Lambda." Interestingly, identifying this information structure also promotes a fine-grained study of the inherent complexity of indulgent consensus. We show that instantiations of our generic algorithm with specific oracles, or combinations of them, match lower bounds on oracle-efficiency, zero-degradation, and one-step-decision. We show, however, that no leader or failure detector-based consensus algorithm can be, at the same time, zero-degrading and configuration-efficient. Moreover, we show that leader-based consensus algorithms that are oracle-efficient are inherently zero-degrading, but some failure detector-based consensus algorithms can be both oracle-efficient and configuration-efficient. These results highlight some of the fundamental trade-offs underlying each oracle.
[ "information structure", "consensus", "asynchronous distributed system", "crash failure", "fault tolerance", "indulgent algorithm", "leader oracle", "modularity", "random oracle", "unreliable failure detector" ]
[ "P", "P", "M", "M", "U", "R", "R", "U", "R", "M" ]
3RyaNUv
From plain character strings to meaningful words: Producing better full text databases for inflectional and compounding languages with morphological analysis software
The paper deals with linguistic processing and retrieval techniques in full-text databases. Special attention is focused on the characteristics of highly inflectional languages and on how the morphological structure of a language should be taken into account when designing and developing information retrieval systems. Finnish is used as an example of a language that has a more complicated inflectional structure than English. In the FULLTEXT project, natural language analysis modules for Finnish were incorporated into the commercial BASIS information retrieval system, which is based on inverted files and Boolean searching. Several test databases were produced, each using one or two Finnish morphological analysis programs.
[ "morphology", "natural language processing", "full text retrieval", "stemming" ]
[ "P", "R", "R", "U" ]
4jT48M8
Monotonicity Inference for Higher-Order Formulas
Formulas are often monotonic in the sense that satisfiability for a given domain of discourse entails satisfiability for all larger domains. Monotonicity is undecidable in general, but we devised three calculi that infer it in many cases for higher-order logic. The third calculus has been implemented in Isabelle's model finder Nitpick, where it is used both to prune the search space and to soundly interpret infinite types with finite sets, leading to dramatic speed and precision improvements.
[ "higher-order logic", "model finding", "isabelle/hol" ]
[ "P", "M", "M" ]
rkZvecm
improving performance of intrusion detection system by applying a new machine learning strategy
The most acute problem for the misuse detection method is its inability to detect new kinds of attacks. A better detection method, which uses a new learning strategy, is proposed to solve this problem. A Concept Hierarchy Generation for attack Labels (CHGL), applying clustering of relevant feature subset codes, enables common machine learning algorithms to learn attack profiles at high concept levels. This enables the system to detect more attack instances. Experimental results show the advantage of this new method.
[ "intrusion detection systems", "misuse detection", "clustering", "classification" ]
[ "P", "P", "P", "U" ]
44vZGZb
Inverse optimality in the class of Hopfield neural networks with input nonlinearity
This paper presents the chaos suppression problem in the class of Hopfield neural networks (HNNs) with input nonlinearity using the inverse optimality approach. Using the inverse optimality technique and based on Lyapunov stability theory, a stabilizing control law, which is optimal with respect to a meaningful cost functional, is determined to achieve global asymptotic stability of the closed-loop system. Numerical simulation is performed on a four-dimensional hyper-chaotic HNN to demonstrate the effectiveness of the proposed method.
[ "inverse optimality", "hopfield neural network", "input nonlinearity", "chaos suppression" ]
[ "P", "P", "P", "P" ]
2N6Eng6
Disambiguating word senses in Korean-Japanese machine translation by using semi-automatically constructed ontology
This paper presents a method for disambiguating word senses in Korean-Japanese machine translation by using a language independent ontology. This ontology stores semantic constraints between concepts and other world knowledge, and enables a natural language processing system to resolve semantic ambiguities by making inferences with the concept network of the ontology. In order to acquire a language-independent and reasonably practical ontology in a limited time and with less manpower, we extend the existing Kadokawa thesaurus by inserting additional semantic relations into its hierarchy, which are classified as case relations and other semantic relations. The former can be obtained by converting valency information and case frames from previously-built electronic dictionaries used in machine translation. The latter can be acquired from concept co-occurrence information, which is extracted automatically from a corpus. In practical machine translation systems, our word sense disambiguation method achieved an improvement of average precision by 6.0% for Japanese analysis and by 9.2% for Korean analysis over the method without using an ontology.
[ "machine translation", "word sense disambiguation", "ontology construction", "ontology representation language", "corpus analysis" ]
[ "P", "P", "R", "M", "R" ]
3owt-x2
PETOOL: MATLAB-based one-way and two-way split-step parabolic equation tool for radiowave propagation over variable terrain
A MATLAB-based one-way and two-way split-step parabolic equation software tool (PETOOL) has been developed with a user-friendly graphical user interface (GUI) for the analysis and visualization of radio-wave propagation over variable terrain and through homogeneous and inhomogeneous atmosphere. The tool has a unique feature over existing one-way parabolic equation (PE)-based codes, because it utilizes the two-way split-step parabolic equation (SSPE) approach with a wide-angle propagator, which is a recursive forward-backward algorithm to incorporate both forward and backward waves into the solution in the presence of variable terrain. First, the formulation of the classical one-way SSPE and the relatively novel two-way SSPE is presented, with particular emphasis on their capabilities and limitations. Next, the structure and the GUI capabilities of the PETOOL software tool are discussed in detail. The calibration of PETOOL is performed and demonstrated via analytical comparisons and/or representative canonical tests performed against Geometrical Optics (GO) + Uniform Theory of Diffraction (UTD). The tool can be used for research and/or educational purposes to investigate the effects of a variety of user-defined terrain and range-dependent refractivity profiles on electromagnetic wave propagation. Program summary Program title: PETOOL (Parabolic Equation Toolbox) Catalogue identifier: AEJS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 143 349 No. of bytes in distributed program, including test data, etc.: 23 280 251 Distribution format: tar.gz Programming language: MATLAB (MathWorks Inc.) 2010a. Partial Differential Equation Toolbox and Curve Fitting Toolbox required Computer: PC Operating system: Windows XP and Vista Classification: 10 Nature of problem: Simulation of radio-wave propagation over variable terrain on the Earth's surface, and through homogeneous and inhomogeneous atmosphere. Solution method: The program implements the one-way and two-way Split-Step Parabolic Equation (SSPE) algorithm, with a wide-angle propagator. The SSPE is, in general, an initial-value problem starting from a reference range (typically from an antenna), and marching out in range by obtaining the field along the vertical direction at each range step, through the use of step-by-step Fourier transformations. The two-way algorithm incorporates the backward-propagating waves into the standard one-way SSPE by utilizing an iterative forward-backward scheme for modeling multipath effects over a staircase-approximated terrain. Unusual features: This is the first software package implementing a recursive forward-backward SSPE algorithm to account for the multipath effects during radio-wave propagation, and enabling the user to easily analyze and visualize the results of the two-way propagation with GUI capabilities. Running time: Problem dependent. Typically, it is about 1.5 ms (for conducting ground) and 4 ms (for lossy ground) per range step for a vertical field profile of vector length 1500, on an Intel Core 2 Duo 1.6 GHz with 2 GB RAM under Windows Vista.
[ "petool", "split-step parabolic equation", "refractivity", "multipath effects", "electromagnetic propagation", "ducting", "terrain factors", "validation", "verification and calibration", "matlab program" ]
[ "P", "P", "P", "P", "R", "U", "M", "U", "M", "R" ]
1tPpif&
Central tendency for symmetric random fuzzy numbers
Random fuzzy numbers are becoming a valuable tool to model and handle fuzzy-valued data generated through a random process. Recent studies have been devoted to introducing measures of the central tendency of random fuzzy numbers showing a more robust behaviour than the so-called Aumann-type mean value. This paper aims to deepen the (rather comparative) analysis of these centrality measures and the Aumann-type mean by examining the situation of symmetric random fuzzy numbers. Similarities and differences with the real-valued case are pointed out, and theoretical conclusions are accompanied by some illustrative examples.
[ "symmetric random fuzzy number", "random fuzzy number", "fuzzy number", "aumann-type mean of a random fuzzy number", "l1 l 1 medians of a random fuzzy number l1 l 1 l1 l 1 l 1", "symmetric fuzzy number" ]
[ "P", "P", "P", "R", "M", "R" ]
XNmkAxC
Energy-Efficient Double-Edge Triggered Flip-Flop
This paper presents a novel design for a double-edge triggered flip-flop (DETFF). A detailed analysis of the transistors used in the DETFF is carried out to determine the critical path. The proposed DETFF employs low-Vth transistors at critical paths such that the power-delay product and the large area consumption caused by low-Vth transistors can be addressed simultaneously. As a result, the proposed DETFF fully utilizes the multi-Vth scheme provided by advanced CMOS processes without suffering from a large area penalty, slow clock frequency, or poor noise immunity. The proposed design is implemented using a typical 0.18-μm 1P6M CMOS process. The measurement results reveal that the proposed DETFF reduces the power-delay product (i.e., dissipated energy) by at least 25%.
[ "double-edge triggered", "flip-flop", "clocking", "low power", "multiple v th" ]
[ "P", "P", "P", "U", "M" ]
-kK:LSA
performance analysis of high-speed digital buses for multiprocessing systems
Current multiprocessing systems are often organized by connecting several devices with similar characteristics (usually processors) to a common bus. These devices demand access with minimal delay; access is controlled by the bus arbitration algorithm. This paper presents a probabilistic analysis of several arbitration algorithms according to several criteria that reflect their relative performance in (1) rendering equal service to all competing devices and (2) allocating available bus bandwidth efficiently. The sensitivity of these criteria to the number of devices on the bus, the speed of the bus, and the distribution of interrequest times is considered. A probabilistic model for the quantitative comparison of these algorithms is constructed in which multiple devices repeatedly issue bus requests at random intervals according to an arbitrary distribution function and are serviced according to one of the algorithms; the devices do not buffer bus requests. The algorithms studied include the static priority, fixed time slice (FTS), two dynamic priority, and first-come, first-served (FCFS) schemes. The performance measures are computed by simulation. The analysis reveals that under heavy bus loads, the dynamic priority and FCFS algorithms offer significantly better performance by these measures than do the static priority and FTS schemes.
[ "performance", "performance analysis", "analysis", "digitize", "systems", "device", "processor", "access", "minimal", "algorithm", "paper", "probabilistic analysis", "rendering", "bandwidth", "sensitive", "distributed", "timing", "probabilistic models", "comparisons", "randomization", "interval", "functional", "buffers", "priorities", "slice", "dynamic", "scheme", "performance measurement", "measurement", "simulation", "relation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
4cffi6H
Visualization of research fronts and knowledge bases by three-dimensional areal densities of bibliographically coupled publications and co-citations
In this work the well-known scientometric concepts of bibliographically coupled publications and co-cited references were applied to produce interactive maps of the research fronts and knowledge bases of research fields. This article proposes a method and some standardization for the detection and visualization of research fronts and knowledge bases with two- and three-dimensional graphics inspired by geographical maps. Agglomerations of bibliographically coupled publications with a common knowledge base are identified and graphically represented by a density function of publications per area unit. The research fronts become visible when publications with similar vectors of common citations are associated and visualized as an ensemble in a three-dimensional graphical representation resembling a mountain scenery, measured with the help of a spatial density. Knowledge bases were calculated in the same way. Maps similar to the geographic representation of oceans and islands are used to visualize the two-dimensional spatial density function of references weighted by individual links. The proposed methodology is demonstrated on publications in the field of battery research.
[ "research fronts", "knowledge bases", "bibliographic coupling", "similarity", "spatial density", "battery research", "co-citation analysis", "science mapping", "3d-visualisation", "local paper density", "emerging research fronts", "jaccard index" ]
[ "P", "P", "P", "P", "P", "P", "M", "M", "U", "M", "M", "U" ]
51aP6Zx
Extremal self-dual codes of length 64 through neighbors and covering radii
We construct extremal singly even self-dual [64,32,12] codes with weight enumerators which were not known to be attainable. In particular, we find some codes whose shadows have minimum weight 12. By considering their doubly even neighbors, extremal doubly even self-dual [64,32,12] codes with covering radius 12 are constructed for the first time.
[ "extremal self-dual code", "neighbor", "covering radius" ]
[ "P", "P", "P" ]
-MWDVVu
Convergence and accuracy of the path integral approach for elastostatics
This paper addresses the convergence rate and accuracy of a numerical technique for linear elastostatics based on a path integral formulation [Int. J. Numer. Meth. Eng. 47 (2000) 1463]. The computational implementation combines a simple polynomial approximation of the displacement field with an approximate statement of the exact evolution equations, which is designated the functional integral method (FIM). A convergence analysis is performed for some simple nodal arrays. This is followed by two different estimations of the optimum parameter ?: one based on statistical arguments and the other on inspection of third-order residuals. When the eight closest neighbors to a node are used for polynomial approximation, the optimum parameter is found to depend on Poisson's ratio and to lie in the range 0.5<?<1.5. Two well-established numerical methods are then recovered as specific instances of the FIM: the strong formulation (point collocation) corresponds to the limit ?=0, while bilinear finite elements correspond exactly to the choice ?=0.5. The use of the optimum parameter provides better precision than the other two methods at similar computational cost. Other nodal arrays are also studied, both in two and three dimensions, and the performance of the FIM is compared with the corresponding finite element and collocation schemes. Finally, the implementation of the FIM on unstructured meshes is discussed, and a numerical example solving the Laplace equation is analyzed. It is shown that the FIM compares favorably with the FEM and offers a number of advantages.
[ "path integral", "meshless methods", "elasticity" ]
[ "P", "M", "U" ]
4PurbLN
Passivity-based output feedback control of Markovian jump systems with discrete and distributed time-varying delays
In this article, we present a new method for designing mode-dependent passivity-based output feedback controllers for Markovian jump systems with time-varying delays. Both discrete and distributed delays are considered in the model. A Lyapunov-Krasovskii function is constructed to establish new sufficient conditions ensuring exponential mean-square stability and the passivity criteria simultaneously. The method produces a linear matrix inequality formulation that allows obtaining controller gains through a convex optimisation method. Finally, a numerical example is given to illustrate the effectiveness of our approach.
[ "output feedback control", "markovian jump system", "passivity criteria", "time-delay" ]
[ "P", "P", "P", "U" ]
-pB2KGY
Cytokines and Pregnancy in Rheumatic Disease
Cytokines are important mediators involved in the successful outcome of pregnancy. The concept of pregnancy as biased toward a Th2 immune response states that Th1-type cytokines are associated with pregnancy failure and that Th2 cytokines are protective and counteract pregnancy-related disorders. Studies at the level of the maternal-fetal interface, in the maternal circulation, and in cells of peripheral blood have shown that the Th2 concept of pregnancy is an oversimplification. Both Th1- and Th2-type cytokines play a role at different stages of pregnancy and are adapted to the localization and function of cells and tissues. The changes of local and systemic cytokine patterns during pregnancy correspond to neuroendocrine changes, with hormones as powerful modulators of cytokine expression. Several autoimmune disorders show a modulation of disease activity during and after pregnancy. In rheumatic diseases with a predominance of a Th1 immune response, a shift to a Th2-type immune response during pregnancy has been regarded as beneficial. Studies of pregnant patients with rheumatoid arthritis (RA) and systemic lupus erythematosus (SLE) have shown a cytokine expression similar to that found in healthy pregnant women. Significant differences were present only for a few cytokines and seemed related to the activity of the underlying disease. Interestingly, a gestational increase of the cytokine inhibitors interleukin 1 receptor antagonist (IL-1ra) and soluble tumor necrosis factor receptor (sTNFR) in the circulation corresponded to low disease activity in RA. The influence of hormones and cytokines on autoimmune disease is an issue for further study.
[ "cytokines", "pregnancy", "rheumatic disease", "th2 immune response", "hormones" ]
[ "P", "P", "P", "P", "P" ]
2GJ9i3L
Platform for real-time subjective assessment of interactive multimedia applications
With the advent of cloud computing and remote execution of interactive applications, there is a need for evaluating the Quality of Experience (QoE) and the influence on this QoE of network condition variations, media encoding parameter settings and related optimization algorithms. However, current QoE assessment focuses mainly on audiovisual quality in non-interactive applications, such as video-on-demand services. On the other hand, where experiments aim to quantify interactive quality, the focus is typically targeted at games, using an ad-hoc test setup to assess the impact of network variations on the playing experience. In this paper, we present a novel platform enabling the assessment of a broad range of interactive applications (e.g., thin client remote desktop systems, remotely rendered engineering applications, games). Dynamic reconfiguration of media encoding and decoding is built into the system, to allow dynamic adaptation of the media encoding to the network conditions and the application characteristics. Evaluating the influence of these automatic adaptations is a key asset of our approach. A range of possible use cases is discussed, as well as a performance study of our implementation, showing that the platform we built is capable of highly controllable subjective user assessment. Furthermore, we present results obtained by applying the platform for a subjective evaluation of an interactive multimedia application. Specifically, the influence of visual quality and frame rate on interactive QoE has been assessed for a remotely executed race game.
[ "interactivity", "quality of experience", "interactive media quality assessment", "subjective quality", "thin client computing" ]
[ "P", "P", "R", "R", "R" ]
3Lq69x:
An Optimal Resource Sharing in Hierarchical Virtual Organizations in the Grid
In large-scale collaborative computing, users and resource providers organize various Virtual Organizations (VOs) to share resources and services. A VO organizes other sub-VOs for the purpose of achieving the VO goal, which forms hierarchical VO environments. VO participants agree upon certain policies, such as resource sharing amounts or user access. In this letter, we provide an optimal resource sharing mechanism for hierarchical VO environments under resource sharing agreements. The proposed algorithm enhances resource utilization and reduces the mean response time of each user.
[ "resource sharing", "virtual organization", "optimal allocation", "grid computing" ]
[ "P", "P", "M", "R" ]
-Ws6t7Z
One-point Klein codes and their serial-in-serial-out systematic encoding
In this paper, a construction of one-point Klein codes over the finite field \(F_q\), with \(q\) a prime power, is illustrated. Many more good one-point Klein codes can be found than the good three-point Klein codes in the literature. Many one-point Klein codes over \(F_q\) are near-MDS long codes. Instead of using automorphisms to facilitate the decoding of the Klein codes, automorphisms are used to derive a systematic encoding for the Klein codes via Gröbner bases. This systematic encoding promises an efficient serial-in-serial-out hardware architecture for the encoder with considerably less complexity than the brute-force one.
[ "klein codes", "systematic encoding", "grbner bases", "algebraic-geometry codes", "error-correcting codes", "94b05" ]
[ "P", "P", "P", "M", "M", "U" ]
4&rPc&z
A Discrete Particle Swarm Optimizer for Multi-Solution Problems
This letter studies a nesting discrete particle swarm optimizer for multi-solution problems. The algorithm operates in a discrete search space and consists of two stages. The first stage is a global search over rough lattice points for constructing local sub-regions, each of which includes one target solution. The second stage is a local search where the algorithm operates in parallel on fine lattice points of the local subspaces and tries to find all the approximate solutions within a criterion. We then propose an application to finding multiple fixed points in nonlinear dynamical systems and investigate the algorithm's efficiency.
[ "optimization", "multi-solution problems", "swarm intelligence" ]
[ "P", "P", "M" ]
4cxsUf6
types for path correctness of xml queries
If a subexpression in a query will never contribute data to the query answer, this should be regarded as an error. This principle has been recently accepted into mainstream XML query languages, but was still waiting for a complete treatment. We provide here a precise definition for this class of errors, and define a type system that is sound and complete, in its search for such errors, for a core language, under mild restrictions on the use of recursion in type definitions. In the process, we describe a dichotomy among existential and universal type systems, which is useful to understand some unusual features of our type system.
[ "types", "correctness", "xml queries", "queries", "queries", "data", "errors", "query languages", "language", "completeness", "precise", "definition", "class", "type system", "sound", "search", "core", "use", "recursion", "process", "feature", "type correctness", "xml types", "query" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "P" ]
2cNSj53
Data storage and extraction in engineering software using XML
Most engineering analysis programs rely on customized procedures for storing and accessing scientific data. The size of the data set needed to characterize an engineering problem can be substantial, particularly as the problem increases in complexity. This paper presents a methodology for greatly improving the accessibility of scientific data using the Extensible Markup Language (XML) to define a consistent procedure for data storage and retrieval. When the data is expressed in an XML format, any portion of the data set can be easily accessed by interfacing with an XML parser. Examples are provided for automating design calculations and CAD drawings.
[ "software", "xml", "data accessibility", "mathcad", "autocad" ]
[ "P", "P", "R", "U", "U" ]
4yscVVR
Big Bang-Big Crunch optimization for parameter estimation in structural systems
A new approach to parameter estimation of structural systems using the recently developed Big Bang-Big Crunch (BB-BC) optimization is proposed, in which the parameter estimation is formulated as a multi-modal optimization problem of high dimension. The BB-BC method is inspired by one of the theories of the evolution of the universe. The potentialities of BB-BC are its inherent numerical simplicity, high convergence speed, and easy implementation. The performance of the proposed method is investigated with simulation results for identifying the parameters of structural systems under conditions including limited output data, noise-polluted signals, and no a priori knowledge of mass, damping, or stiffness. It is observed that BB-BC gives comparatively better results than existing methods. Moreover, the method is computationally simpler.
[ "parameter estimation", "structural system", "big bang-big crunch (bb-bc) optimization" ]
[ "P", "P", "P" ]
4K43gJ6
a multi-agent system for automated genomic annotation
Massive amounts of raw data are currently being generated by biologists while sequencing organisms. Outside of the largest, high-profile projects such as the Human Genome Project, most of this raw data must be analyzed through the piecemeal application of various computer programs and searches of various public web databases. Due to inexperience and lack of training, both the raw data and any valuable derived knowledge will remain generally unavailable except in published textual forms. Multi-agent information gathering systems have a lot to contribute to these efforts, even at the current state of the art. We have used DECAF, a multi-agent system toolkit based on RETSINA and TAEMS, to construct a prototype multi-agent system for automated annotation and database storage of sequencing data for herpesviruses. The resulting system eliminates tedious and always out-of-date hand analyses, makes the data and annotations available to other researchers (or agent systems), and provides a level of query processing beyond even some high-profile web sites.
[ "lessons learned from deployed agents", "information agents" ]
[ "M", "R" ]
3U5x8N4
CosmoHammer: Cosmological parameter estimation with the MCMC Hammer
We analyse MCMC methods for cosmology regarding parallelisability and efficiency. We present the Python framework CosmoHammer for parallelised MCMC sampling. It enables us to estimate cosmological parameters on high performance clusters. To test the efficiency of CosmoHammer we use an elastic cloud computing environment.
[ "cosmological parameter estimation", "cloud computing", "markov chain monte carlo methods" ]
[ "P", "P", "M" ]
1SnkWJC
evolutionary clustering with arbitrary subspaces
Subspace clustering algorithms in their most general form attempt to describe data with clusters that are not constrained to index a common set of attributes. Previous evolutionary approaches to this problem have assumed a weaker model in which clusters are built in a common subset. Moreover, a filter method is generally assumed in which a classical clustering algorithm is employed in the inner loop. Needless to say, this presents a considerable computational overhead. In this work we recognize the utility of assuming a `bottom-up' approach to subspace clustering. Specifically, we apply a classical clustering algorithm to each attribute to establish 1-d clusters that are then indexed by a MOGA to design a population of subspace clusters. The ensuing search is entirely in terms of a combinatorial optimization problem, thus computationally very efficient. A final single objective GA is then applied to search the set of subspace clusters identified under the MOGA for the most suitable combination.
[ "subspace clustering", "multi-objective genetic algorithm" ]
[ "P", "M" ]
-VN2m62
DC-DC converter for fuel-cells and portable devices in digital CMOS technology
The manuscript discusses the design of an integrated DC-DC power converter in a digital 0.18 μm CMOS technology for fuel cells and portable applications. By means of a combined boost and switched-capacitor architecture and design optimization, a suitable efficiency has been achieved without resorting to special process options and with a limited number of external passive components. The achieved results enable the implementation of a fuel-cell power-converter system featuring low cost and small size, as required by the market for portable devices.
[ "dc-dc converters", "fuel-cells", "cmos analog circuits" ]
[ "P", "P", "M" ]
3BHjps7
The PCP Theorem for NP Over the Reals
In this paper we show that the PCP theorem holds as well in the real number computational model introduced by Blum, Shub, and Smale. More precisely, the real number counterpart \(\mathrm{NP}_{\mathbb{R}}\) of the classical Turing model class NP can be characterized as \(\mathrm{NP}_{\mathbb{R}} = \mathrm{PCP}_{\mathbb{R}}(O(\log{n}), O(1))\). Our proof structurally follows the one by Dinur for classical NP. However, many minor and major changes are necessary because the real numbers are the underlying computational structure. The analogous result holds for the complex numbers and \(\mathrm{NP}_{\mathbb{C}}\).
[ "pcp theorem", "probabilistically checkable proofs", "real and complex number computations", "hilbert nullstellensatz decision problem", "03d78", "68q15", "68q17", "68q87" ]
[ "P", "M", "R", "U", "U", "U", "U", "U" ]
3wWG&rS
Continuation-based transformations for coordination languages
Coordination languages for parallel and distributed systems specify mechanisms for creating tasks and communicating data among them. These languages typically assume that (a) once a task begins execution on some processor, it will remain resident on that processor throughout its lifetime, and (b) shared data is communicated among tasks through some form of message-passing and data migration. In this paper, we investigate an alternative approach to understanding coordination. Communication-passing style (CmPS) refers to a coordination semantics in which data communication is always undertaken by migrating the continuation of the task requiring the data to the processor where the data resides. Communication-passing style is closely related to continuation-passing style (CPS), a useful transformation for compiling functional languages. Just as CPS eliminates implicit call-return sequences, CmPS eliminates implicit inter-processor data communication and synchronization requests. In a CmPS-transformed program, only continuations (i.e., control contexts) are transmitted across machines; all synchronization and data communication occurs locally. Besides providing significant optimization opportunities, CmPS is a natural representation for implementations on networks of workstations. This paper presents several operational semantics for a coordination language that supports first-class (shared) distributed data repositories. The computation sub-language considered is an untyped call-by-value functional language similar to pure Scheme. The first semantics describes a conventional synchronous message-passing implementation; the second is a formulation of a CmPS implementation; and the third refines this implementation to support computation migration, a technique to lazily migrate control state. Using computation migration, an implementation "distributes" a continuation among multiple machines, reducing the bandwidth required to support thread mobility. We prove the equivalence of all three systems and describe the optimizations and implementation issues that arise from using a CmPS-driven coordination language.
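To make the CPS connection concrete, here is a toy Python illustration (the paper's computation language is Scheme-like; this sketch is purely illustrative):

```python
# Direct style: the call/return sequence is implicit in the stack.
def add_direct(x, y):
    return x + y

# CPS: each function receives an explicit continuation `k` representing
# "the rest of the computation"; control is always passed forward and
# nothing ever returns up the stack.
def add_cps(x, y, k):
    k(x + y)

def square_cps(x, k):
    k(x * x)

# (x + y)^2 written as an explicit chain of continuations:
add_cps(2, 3, lambda s: square_cps(s, print))  # prints 25
```

In CmPS, the continuation `k` is precisely the control context that gets serialized and shipped to the processor holding the required data, which is why every data access in the transformed program is local.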
[ "continuations", "operational semantics", "distributed programming", "task migration", "tuple-spaces" ]
[ "P", "P", "R", "R", "U" ]
-rZPGde
Performance assessment of modelling tools for high resolution runoff simulation over an industrial site
Intense rainfall can generate storm sewer system failures along with large surface runoff events, which are an issue for the safety assessment of industrial sites. Numerical modelling tools, including standard two-dimensional (2D) free surface flow models, are applied in a wide variety of practical flood risk studies, often beyond the purpose for which they were originally designed. This study focuses on the possibilities, performance, and limits of standard modelling tools for high resolution runoff simulation over an industrial site. Two categories of runoff scenarios are tested over this industrial test case with three modelling tools relying on different numerical schemes. Simulated water depth evolutions are found to be comparable between the tools; nevertheless, the tools prove unequal in their ability to exploit a highly refined topographical resolution for runoff scenarios. The indicators used for computation reliability checks do not reveal major inconsistencies in the calculations, even under aggressive model optimisation. Emphasis is therefore placed on the restrictions standard modelling tools face in balancing computational stability, speed, and precision in high resolution runoff modelling.
[ "computation reliability check", "runoff modelling", "industrial flood risk", "mike", "open foam" ]
[ "P", "P", "R", "U", "U" ]
4ho6j&f
toward a better understanding of tool usage (nier track)
Developers use tools to build software systems, and allegedly better tools are continually being produced and purchased. Still, there have been only limited studies of how people really use tools; these studies have relied on limited data, and the interactions between tools have not been properly elaborated. The advent of AISEMA (Automated In-Process Software Engineering Measurement and Analysis) systems [3] has enabled a more detailed collection of tool data. Our "new idea" is to take advantage of such data to build a simple model based on an oriented graph that enables a good understanding of how tools are used individually and collectively. We have empirically validated the model by analyzing an industrial team of 19 developers over a period of 10 months.
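A minimal sketch of such an oriented-graph model in Python; the event log is hypothetical and stands in for AISEMA-collected data:

```python
from collections import Counter

# Hypothetical event log from an AISEMA-style collector:
# per developer, the timestamp-ordered sequence of tools used.
events = {"dev1": ["editor", "compiler", "editor", "debugger", "editor"],
          "dev2": ["browser", "editor", "compiler", "editor"]}

# Oriented graph: nodes are tools; the edge (a, b) counts how often
# a developer switched directly from tool a to tool b.
edges = Counter()
for tools in events.values():
    edges.update(zip(tools, tools[1:]))

for (a, b), n in edges.most_common():
    print(f"{a} -> {b}: {n}")
```

Edge weights of this kind capture both individual usage (node degree) and collective tool interactions (heavy edges), which is the understanding the model above aims at.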
[ "tool usage" ]
[ "P" ]