Meta-analytic evidence for common and distinct neural networks associated with directly experienced pain and empathy for pain
A growing body of evidence suggests that empathy for pain is underpinned by neural structures that are also involved in the direct experience of pain. In order to assess the consistency of this finding, an image-based meta-analysis of nine independent functional magnetic resonance imaging (fMRI) investigations and a coordinate-based meta-analysis of 32 studies that had investigated empathy for pain using fMRI were conducted. The results indicate that a core network consisting of bilateral anterior insular cortex and medial/anterior cingulate cortex is associated with empathy for pain. Activation in these areas overlaps with activation during directly experienced pain, and we link their involvement to representing global feeling states and the guidance of adaptive behavior for both self- and other-related experiences. Moreover, the image-based analysis demonstrates that, depending on the type of experimental paradigm, this core network was co-activated with distinct brain regions: while viewing pictures of body parts in painful situations more strongly recruited areas underpinning action understanding (inferior parietal/ventral premotor cortices), eliciting empathy by means of abstract visual information about the other's affective state more strongly engaged areas associated with inferring and representing mental states of self and other (precuneus, ventral medial prefrontal cortex, superior temporal cortex, and temporo-parietal junction). In addition, only the picture-based paradigms activated somatosensory areas, indicating that previous discrepancies concerning somatosensory activity during empathy for pain might have resulted from differences in experimental paradigms. We conclude that social neuroscience paradigms provide reliable and accurate insights into complex social phenomena such as empathy and that meta-analyses of previous studies are a valuable tool in this endeavor.
Part-Stacked CNN for Fine-Grained Visual Categorization
In the context of fine-grained visual categorization, the ability to interpret models as human-understandable visual manuals is sometimes as important as achieving high classification accuracy. In this paper, we propose a novel Part-Stacked CNN architecture that explicitly explains the fine-grained recognition process by modeling subtle differences from object parts. Based on manually labeled strong part annotations, the proposed architecture consists of a fully convolutional network to locate multiple object parts and a two-stream classification network that encodes object-level and part-level cues simultaneously. By adopting a set of sharing strategies between the computation of multiple object parts, the proposed architecture is very efficient, running at 20 frames/sec during inference. Experimental results on the CUB-200-2011 dataset reveal the effectiveness of the proposed architecture from multiple perspectives: classification accuracy, model interpretability, and efficiency. Being able to provide interpretable recognition results in real time, the proposed method is believed to be effective in practical applications.
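To make the two-stream idea concrete, here is a minimal PyTorch sketch in which a single shared feature map feeds both an object-level descriptor and per-part descriptors pooled from localized part boxes. The backbone, part count, and fusion by concatenation are illustrative assumptions, not the exact Part-Stacked CNN.

```python
# Minimal PyTorch sketch of a two-stream, part-based classifier.
# Backbone choice, part count, and fusion strategy are illustrative
# assumptions, not the exact Part-Stacked CNN architecture.
import torch
import torch.nn as nn
import torchvision.models as models

class TwoStreamPartClassifier(nn.Module):
    def __init__(self, num_classes=200, num_parts=15):
        super().__init__()
        backbone = models.resnet18(weights=None)
        # Shared feature extractor (everything before the classifier head).
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d(1)
        feat_dim = 512
        # One object-level descriptor plus one pooled descriptor per part,
        # all computed from the same shared feature map.
        self.classifier = nn.Linear(feat_dim * (1 + num_parts), num_classes)
        self.num_parts = num_parts

    def forward(self, image, part_boxes):
        # part_boxes: (batch, num_parts, 4) non-empty boxes in feature-map
        # coordinates, e.g. from a fully convolutional part-localization net.
        fmap = self.features(image)                      # (B, 512, H, W)
        obj = self.pool(fmap).flatten(1)                 # object-level cue
        parts = []
        for p in range(self.num_parts):
            crops = []
            for b in range(fmap.size(0)):
                x0, y0, x1, y1 = part_boxes[b, p].int().tolist()
                crops.append(self.pool(fmap[b:b+1, :, y0:y1, x0:x1]).flatten(1))
            parts.append(torch.cat(crops, 0))            # part-level cue
        return self.classifier(torch.cat([obj] + parts, dim=1))
```

Computing one backbone pass and reusing it for every part mirrors the computation-sharing idea that makes the original architecture fast at inference.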
Engineering Self-organising Systems
Nowadays, applications are becoming increasingly complex, and multi-agent systems have proven to be an efficient paradigm for implementing this complexity, especially when self-organisation principles are applied. However, designing such self-organising systems remains an issue: even though many agent-oriented methodologies exist for developing multi-agent systems, only a few address how to help designers apply self-organisation and emergence principles. This chapter aims at expounding some of them, with a special focus on the most mature one, ADELFE.
Stem cell-based cell therapy in neurological diseases: a review.
Human neurological disorders such as Parkinson's disease, Huntington's disease, amyotrophic lateral sclerosis (ALS), Alzheimer's disease, multiple sclerosis (MS), stroke, and spinal cord injury are caused by a loss of neurons and glial cells in the brain or spinal cord. Cell replacement therapy and gene transfer to the diseased or injured brain have provided the basis for the development of potentially powerful new therapeutic strategies for a broad spectrum of human neurological diseases. However, the paucity of suitable cell types for cell replacement therapy in patients suffering from neurological disorders has hampered the development of this promising therapeutic approach. In recent years, neurons and glial cells have successfully been generated from stem cells such as embryonic stem cells, mesenchymal stem cells, and neural stem cells, and investigators have made extensive efforts to develop stem cell-based brain transplantation therapies. We review here notable experimental and preclinical studies previously published involving stem cell-based cell and gene therapies for Parkinson's disease, Huntington's disease, ALS, Alzheimer's disease, MS, stroke, spinal cord injury, brain tumor, and lysosomal storage diseases, and we discuss the future prospects for stem cell therapy of neurological disorders in the clinical setting. There are still many obstacles to be overcome before clinical application of cell therapy in neurological disease patients is adopted: 1) it is still uncertain what kind of stem cells would be an ideal source for cellular grafts, and 2) the mechanism by which transplantation of stem cells leads to enhanced functional recovery and structural reorganization must be better understood. Steady and solid progress in stem cell research in both basic and preclinical settings should support the hope for development of stem cell-based cell therapies for neurological diseases.
Automatic Text Summarization: Current State of the Art
To facilitate the task of reading and searching for information, it became necessary to find a way to reduce the size of documents without affecting their content. The solution lies in automatic text summarization systems, which take an input text and produce a smaller, more condensed version without losing relevant information or the meaning conveyed by the original text. Research in this area has recently made strong progress, especially for the English language. However, research on Arabic text summarization is scarce and still in its infancy. In this paper we present a literature review of recent techniques and work in the field of automatic text summarization, and we then focus our discussion on work concerning automatic text summarization in particular languages. We also discuss some of the main problems that affect the quality of automatic text summarization systems.
The neutral theory of molecular evolution in the genomic era.
The neutral theory of molecular evolution has been widely accepted and is the guiding principle for studying evolutionary genomics and the molecular basis of phenotypic evolution. Recent data on genomic evolution are generally consistent with the neutral theory. However, many recently published papers claim the detection of positive Darwinian selection via the use of new statistical methods. Examination of these methods has shown that their theoretical bases are not well established and that they often produce high rates of false-positive and false-negative results. When the deficiencies of these statistical methods are rectified, the results become largely consistent with the neutral theory. At present, genome-wide analyses of natural selection consist of collections of single-locus analyses. However, because phenotypic evolution is controlled by the interaction of many genes, the study of natural selection ought to take such interactions into account. Experimental studies of evolution will also be crucial.
Menopause in type 1 diabetic women: is it premature?
Women with type 1 diabetes have a delayed menarche and a greater prevalence of menstrual disorders than women without diabetes. However, little is known about the menopause transition among type 1 diabetic women. The Familial Autoimmune and Diabetes (FAD) Study recruited both adult individuals who were identified from the Children's Hospital of Pittsburgh Type 1 Diabetes Registry for the years 1950-1964 and their family members. Unrelated nondiabetic control probands and their relatives were also evaluated. Women with type 1 diabetes (n = 143) compared with nondiabetic sisters (n = 186) or unrelated control subjects (n = 160) were more likely to have an older age at menarche (13.5, 12.5, and 12.6 years, respectively, P < 0.001), more menstrual irregularities before 30 years of age (45.7, 33.3, and 33.1%, respectively, P = 0.04), and a younger age at menopause (41.6, 49.9, and 48.0 years, respectively, P = 0.05). This resulted in a 6-year reduction in the number of reproductive years (30.0, 37.0, and 35.2 years, respectively, P = 0.05) for women with type 1 diabetes. Risk factors univariately associated with earlier menopause included type 1 diabetes (hazard ratio [HR] 1.99, P = 0.04), menstrual irregularities before 30 years of age (HR 1.87, P = 0.04), nulliparity (HR 2.14, P = 0.01), and unilateral oophorectomy (HR 6.51, P < 0.0001). Multivariate analysis confirmed that type 1 diabetes (HR 1.98, P = 0.056), menstrual irregularities by 30 years of age (HR 2.36, P = 0.01), and unilateral oophorectomy (HR 9.76, P < 0.0001) were independent determinants of earlier menopause in our cohort. We hypothesize that an earlier menopause, which resulted in a 17% decrease in reproductive years, is a major unstudied complication of type 1 diabetes.
The price of unsustainability: An experiment with professional private equity investors
This paper sheds light on the impact that sustainable and unsustainable corporate practices have on equity financing. We present a unique framed field experiment in which professional private equity investors competed in closed auctions to acquire fictive firms. We observe that corporate non-financial performance impacts firm valuation and investment decisions, and we quantify to what extent. The main result is an asymmetric effect: entrepreneurs have more to lose from unsustainable practices than they stand to gain from sustainable ones. Our findings are discussed in terms of practical implications for both investors and firm managers.
Distributed denial of service (DDoS) resilience in cloud: Review and conceptual cloud DDoS mitigation framework
Despite the increasing popularity of cloud services, ensuring the security and availability of data, resources, and services remains an ongoing research challenge. Distributed denial of service (DDoS) attacks are not a new threat, but they remain a major security challenge and a topic of ongoing research interest. Mitigating DDoS attacks in the cloud presents a new dimension relative to solutions proffered in traditional computing, owing to the cloud's architecture and features. This paper reviews 96 publications on DDoS attacks and defense approaches in cloud computing published between January 2009 and December 2015, and discusses existing research trends. A taxonomy and a conceptual cloud DDoS mitigation framework based on change point detection are presented. Future research directions are also outlined.
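Since the proposed framework rests on change point detection, here is a minimal sketch of a CUSUM-style detector over a per-interval traffic statistic; the threshold and drift are assumed tuning parameters, not values prescribed by the framework.

```python
# Minimal CUSUM-style change-point detector over a per-interval traffic
# statistic (e.g. request rate). Threshold h and drift k are assumed
# tuning parameters, not values prescribed by the surveyed framework.
def cusum_detect(samples, k=0.5, h=5.0):
    """Return the index of the first detected upward change, or None."""
    mean = samples[0]
    s = 0.0
    for i, x in enumerate(samples):
        # Running baseline of "normal" traffic (simple exponential average).
        mean = 0.99 * mean + 0.01 * x
        # Accumulate positive deviations beyond the allowed drift k.
        s = max(0.0, s + (x - mean - k))
        if s > h:
            return i  # traffic shifted upward: possible DDoS onset
    return None

# Example: flat traffic followed by a sudden sustained surge.
traffic = [10, 11, 9, 10, 12, 10, 11, 40, 42, 45, 44, 43]
print(cusum_detect(traffic))  # detects shortly after the surge begins
```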
Toxicity of Hexavalent Chromium and Its Reduction by Bacteria Isolated from Soil Contaminated with Tannery Waste
An Arthrobacter sp. and a Bacillus sp., isolated from soil contaminated long-term with tannery waste, were examined for their tolerance to hexavalent chromium [Cr(VI)] and their ability to reduce Cr(VI) to Cr(III), a detoxification process, in cell suspensions and cell extracts. Both bacteria tolerated Cr(VI) at 100 mg/ml on a minimal salts agar medium supplemented with 0.5% glucose, but only Arthrobacter could grow in liquid medium at this concentration. Arthrobacter sp. could reduce Cr(VI) at up to 50 μg/ml, while Bacillus sp. was not able to reduce Cr(VI) beyond 20 μg/ml. Arthrobacter sp. was distinctly superior to Bacillus sp. in terms of Cr(VI)-reducing ability and resistance to Cr(VI). Assays with permeabilized cells (treated with toluene or Triton X-100) and crude extracts demonstrated that Cr(VI) reduction was mainly associated with the soluble protein fraction of the cell. Arthrobacter sp. has great potential for bioremediation of Cr(VI)-containing waste.
OpinionFinder: A System for Subjectivity Analysis
OpinionFinder: A System for Subjectivity Analysis. Theresa Wilson, Paul Hoffmann, Swapna Somasundaran, Jason Kessler, Janyce Wiebe, Yejin Choi, Claire Cardie, Ellen Riloff, and Siddharth Patwardhan. Intelligent Systems Program and Department of Computer Science, University of Pittsburgh, Pittsburgh, PA; Department of Computer Science, Cornell University, Ithaca, NY; School of Computing, University of Utah, Salt Lake City, UT. Vancouver, October 2005.
A dynamic modularity based community detection algorithm for large-scale networks: DSLM
In this work, a new fast dynamic community detection algorithm for large-scale networks is presented. Most previous community detection algorithms are designed for static networks. However, large-scale social networks are dynamic and evolve frequently over time. To quickly detect communities in dynamic large-scale networks, we propose the Dynamic Modularity Optimizer (DMO) framework, constructed by modifying a well-known static modularity-based community detection algorithm. The proposed framework is tested on several different datasets. According to our results, community detection algorithms in the proposed framework outperform static algorithms when large-scale dynamic networks are considered.
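One way to realize the "modify a static optimizer" idea is to warm-start Louvain on each snapshot with the previous snapshot's partition, as in the sketch below using networkx and the python-louvain package; this illustrates the general approach, not the exact DSLM/DMO procedure.

```python
# Sketch of warm-started dynamic community detection: re-run a static
# modularity optimizer (Louvain, via python-louvain) on each network
# snapshot, seeding it with the previous snapshot's partition. This is
# an illustration of the idea, not the exact DSLM/DMO algorithm.
import networkx as nx
import community as community_louvain  # pip install python-louvain

def detect_dynamic(snapshots):
    partition = None
    results = []
    for graph in snapshots:
        if partition is not None:
            # Seed must cover every node: carried-over nodes keep their
            # community, new nodes each start in a fresh community.
            next_c = max(partition.values()) + 1
            seed = {}
            for n in graph:
                if n in partition:
                    seed[n] = partition[n]
                else:
                    seed[n] = next_c
                    next_c += 1
        else:
            seed = None
        partition = community_louvain.best_partition(graph, partition=seed)
        results.append(partition)
    return results

# Example: a small network that gains an edge in the next snapshot.
g1 = nx.karate_club_graph()
g2 = g1.copy()
g2.add_edge(0, 33)  # a new interaction appears over time
for t, part in enumerate(detect_dynamic([g1, g2])):
    print(t, len(set(part.values())), "communities")
```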
Analysis of Retinal Vessel Segmentation with Deep Learning and its Effect on Diabetic Retinopathy Classification
The success of deep learning methodologies has drawn considerable attention to their applications in medical image analysis. One such application is the segmentation of retinal vessels and severity classification of diabetic retinopathy (DR) from retinal funduscopic images. This paper studies U-Net model performance in segmenting retinal vessels under different settings of dropout and batch normalization and uses it to investigate the effect of retinal vessels on DR classification. A pre-trained Inception V1 network was used to classify DR severity. Two sets of retinal images, with and without the presence of vessels, were created from the MESSIDOR dataset. The vessel extraction process was performed using the best U-Net trained on the DRIVE dataset. The final analysis showed that retinal vessels are a good feature for classifying both severe and early cases of DR.
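The settings being compared can be seen in a single U-Net encoder block; the Keras sketch below exposes dropout and batch normalization as toggles. Filter counts and rates are illustrative assumptions, not the paper's tuned values.

```python
# Sketch of one U-Net encoder block with the two settings the study
# varies: dropout and batch normalization. Filter counts and rates are
# illustrative assumptions, not the paper's tuned values.
import tensorflow as tf
from tensorflow.keras import layers

def unet_block(x, filters, use_batchnorm=True, dropout_rate=0.0):
    for _ in range(2):
        x = layers.Conv2D(filters, 3, padding="same")(x)
        if use_batchnorm:
            x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
    if dropout_rate > 0.0:
        x = layers.Dropout(dropout_rate)(x)
    skip = x                      # skip connection to the decoder
    x = layers.MaxPooling2D(2)(x)
    return x, skip

inputs = tf.keras.Input(shape=(256, 256, 1))   # grayscale fundus patch
x, skip1 = unet_block(inputs, 32, use_batchnorm=True, dropout_rate=0.2)
x, skip2 = unet_block(x, 64, use_batchnorm=True, dropout_rate=0.2)
# ... bottleneck and upsampling path with concatenated skips would follow.
```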
A survey of the sorghum transcriptome using single-molecule long reads
Alternative splicing and alternative polyadenylation (APA) of pre-mRNAs greatly contribute to transcriptome diversity, coding capacity of a genome and gene regulatory mechanisms in eukaryotes. Second-generation sequencing technologies have been extensively used to analyse transcriptomes. However, a major limitation of short-read data is that it is difficult to accurately predict full-length splice isoforms. Here we sequenced the sorghum transcriptome using Pacific Biosciences single-molecule real-time long-read isoform sequencing and developed a pipeline called TAPIS (Transcriptome Analysis Pipeline for Isoform Sequencing) to identify full-length splice isoforms and APA sites. Our analysis reveals transcriptome-wide full-length isoforms at an unprecedented scale with over 11,000 novel splice isoforms. Additionally, we uncover APA of ∼11,000 expressed genes and more than 2,100 novel genes. These results greatly enhance sorghum gene annotations and aid in studying gene regulation in this important bioenergy crop. The TAPIS pipeline will serve as a useful tool to analyse Iso-Seq data from any organism.
Algebraic tensegrity form-finding
This paper concerns the form-finding problem for general and symmetric tensegrity structures with shape constraints. A number of different geometries are treated and several fundamental properties of tensegrity structures are identified that simplify the form-finding problem. The concept of a tensegrity invariance (similarity) transformation is defined and it is shown that tensegrity equilibrium is preserved under affine node position transformations. This result provides the basis for a new tensegrity form-finding tool. The generality of the problem formulation makes it suitable for the automated generation of the equations and their derivatives. State-of-the-art numerical algorithms are applied to solve several example problems. Examples are given for tensegrity plates, shell-class symmetric tensegrity structures and structures generated by applying similarity transformation.
Hardware compilation of application-specific memory-access interconnect
A major obstacle to successful high-level synthesis (HLS) of large-scale application-specific integrated circuit systems is the presence of memory accesses to a shared-memory subsystem. The latency to access memory is often not statically predictable, which creates problems for scheduling operations dependent on memory reads. More fundamental is that dependences between accesses may not be statically provable (e.g., if the specification language permits pointers), which introduces memory-consistency problems. Addressing these issues with static scheduling results in overly conservative circuits, and thus, most state-of-the-art HLS tools limit memory systems to those that have predictable latencies and limit programmers to specifications that forbid arbitrary memory-reference patterns. A new HLS framework for the synthesis and optimization of memory accesses (SOMA) is presented. SOMA enables specifications to include arbitrary memory references (e.g., pointers) and allows the memory system to incorporate features that might cause the latency of a memory access to vary dynamically. This raises the level of abstraction in the input specification, enabling faster design times. SOMA synthesizes a memory access network (MAN) architecture that facilitates dynamic scheduling and ordering of memory accesses. The paper describes a basic MAN construction technique that illustrates how dynamic ordering helps in efficiently maintaining memory consistency and how dynamic scheduling helps alleviate the variable-latency problem. Then, it is shown how static analysis of the access patterns can be used to optimize the MAN. One optimization changes the MAN interconnect topology to increase concurrency. A second optimization reduces the synchronization overhead necessary to maintain memory consistency. Post-layout experiments demonstrate that SOMA's application-specific MAN construction significantly improves power and performance for a range of benchmarks.
Bridging Physical and Virtual Worlds with Electronic Tags
The role of computers in the modern office has divided our activities between virtual interactions in the realm of the computer and physical interactions with real objects within the traditional office infrastructure. This paper extends previous work that has attempted to bridge this gap, to connect physical objects with virtual representations or computational functionality, via various types of tags. We discuss a variety of scenarios we have implemented using a novel combination of inexpensive, unobtrusive and easy to use RFID tags, tag readers, portable computers and wireless networking. This novel combination demonstrates the utility of invisibly, seamlessly and portably linking physical objects to networked electronic services and actions that are naturally associated with their form.
A Conversation About Radicalism in Contemporary Greece
Because of its economic issues, Greece has often been in the news during the past several years. But the news often offers only superficial accounts of what is happening in Greek society. In the summer of 2015, on a warm day in Athens, I joined Dr. Vassiliki Georgiadou (VG) (political science, Panteion University); Dr. Lamprini Rori (LR), a Marie Curie fellow (Bournemouth University); and Dr. Despina Papadimitriou (DP) (history, Panteion University) in a faculty office in Panteion University to talk about radicalism in contemporary Greek society and how various kinds of radicalism are related to economic, social, and political upheaval. As research collaborators and coauthors, their recent research, particularly that of Dr. Georgiadou, has focused on the right-wing group Chryssa Avgi, or Golden Dawn, and in the following conversation they discuss not only this group but also their broader assessment of the connections between political and economic unrest, and they offer some surprising observations about radicalism of the left and the right “on the ground” in contemporary Greece. ARTHUR VERSLUIS
How Can Edge Computing Benefit From Software-Defined Networking: A Survey, Use Cases, and Future Directions
Edge Computing is a novel paradigm that is changing the scene for modern communication and computation systems. It is no coincidence that terms like Mobile Cloud Computing, Cloudlets, Fog Computing, and Mobile-Edge Computing are gaining popularity in both academia and industry. In this paper, we embrace all these terms under the umbrella concept of “Edge Computing” to name the trend in which computational infrastructures, and hence the services themselves, are getting closer to the end user. However, we observe that bringing computational infrastructures into the proximity of the user does not magically solve all technical challenges. Moreover, it creates complexities of its own when not carefully handled. In this paper, these challenges are discussed in depth and categorically analyzed. As a solution direction, we propose that another major trend in networking, namely software-defined networking (SDN), should be taken into account. SDN, which was not proposed specifically for Edge Computing, can in fact serve as an enabler to lower the complexity barriers involved and let the real potential of Edge Computing be achieved. To fully demonstrate our ideas, we initially put forward a clear collaboration model for the SDN-Edge Computing interaction through practical architectures and show that SDN-related mechanisms can feasibly operate within Edge Computing infrastructures. Then, we provide a detailed survey of the approaches that comprise the Edge Computing domain. A comparative discussion elaborates on where these technologies meet as well as how they differ. Later, we discuss the capabilities of SDN and align them with the technical shortcomings of Edge Computing implementations. We thoroughly investigate the possible modes of operation and interaction between the aforementioned technologies in all directions and technically deduce a set of “Benefit Areas”, which is discussed in detail. Lastly, as SDN is an evolving technology, we give future directions for enhancing SDN development so that it can take this collaboration to a further level.
A stochastic model of human-machine interaction for learning dialog strategies
In this paper, we propose a quantitative model for dialog systems that can be used for learning the dialog strategy. We claim that the problem of dialog design can be formalized as an optimization problem with an objective function reflecting different dialog dimensions relevant for a given application. We also show that any dialog system can be formally described as a sequential decision process in terms of its state space, action set, and strategy. With additional assumptions about the state transition probabilities and cost assignment, a dialog system can be mapped to a stochastic model known as a Markov decision process (MDP). A variety of data-driven algorithms for finding the optimal strategy (i.e., the one that optimizes the criterion) are available within the MDP framework, based on reinforcement learning. For effective use of the available training data, we propose a combination of supervised and reinforcement learning: supervised learning is used to estimate a model of the user, i.e., the MDP parameters that quantify the user’s behavior. Then a reinforcement learning algorithm is used to estimate the optimal strategy while the system interacts with the simulated user. This approach is tested by learning the strategy in an air travel information system (ATIS) task. The experimental results we present in this paper show that it is indeed possible to find a simple criterion, a state space representation, and a simulated user parameterization in order to automatically learn a relatively complex dialog behavior, similar to one that was heuristically designed by several research groups.
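A toy sketch of the two-stage recipe follows: a hand-coded user model stands in for the supervised estimate, and tabular Q-learning finds the cost-minimizing strategy against it. The states, actions, and costs are invented for illustration and are not the ATIS task's actual MDP.

```python
# Toy sketch of the two-stage recipe: (1) a user model (here hand-coded,
# in the paper estimated from data) drives a simulator, (2) Q-learning
# finds a dialog strategy against it. States, actions, and costs are
# invented for illustration, not the ATIS task's actual MDP.
import random

states = ["need_city", "need_date", "done"]
actions = ["ask_city", "ask_date", "confirm"]

def simulated_user(state, action):
    """Return (next_state, cost); a stand-in for the learned user model."""
    if state == "need_city" and action == "ask_city":
        return "need_date", 1.0          # one turn of cost, progress made
    if state == "need_date" and action == "ask_date":
        return "done", 1.0
    if state == "done" and action == "confirm":
        return "done", 0.0
    return state, 2.0                    # unhelpful action: wasted turn

Q = {(s, a): 0.0 for s in states for a in actions}
alpha, gamma, epsilon = 0.1, 0.95, 0.2

for _ in range(5000):
    s = "need_city"
    for _ in range(10):                  # cap dialog length
        a = (random.choice(actions) if random.random() < epsilon
             else min(actions, key=lambda act: Q[(s, act)]))
        s2, cost = simulated_user(s, a)
        # Q-learning update (costs are minimized, hence min over actions).
        Q[(s, a)] += alpha * (cost + gamma * min(Q[(s2, b)] for b in actions)
                              - Q[(s, a)])
        s = s2

# Learned strategy: ask for the city, then the date, then confirm.
print({s: min(actions, key=lambda act: Q[(s, act)]) for s in states})
```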
Coherent-array imaging using phased subarrays. Part I: basic principles
The front-end hardware complexity of a coherent array imaging system scales with the number of active array elements that are simultaneously used for transmission or reception of signals. Different imaging methods use different numbers of active channels and data collection strategies. Conventional full phased array (FPA) imaging produces the best image quality, using all elements for both transmission and reception, and it has high front-end hardware complexity. In contrast, classical synthetic aperture (CSA) imaging transmits on and receives from only a single element at a time, minimizing the hardware complexity but achieving poor image quality. We propose a new coherent array imaging method, phased subarray (PSA) imaging, that performs partial transmit and receive beam-forming using a subset of adjacent elements at each firing step. This method reduces the number of active channels to the number of subarray elements; these channels are multiplexed across the full array, and a reduced number of beams are acquired from each subarray. The low-resolution subarray images are laterally upsampled, interpolated, weighted, and coherently summed to form the final high-resolution PSA image. PSA imaging reduces the complexity of the front-end hardware while achieving image quality approaching that of FPA imaging.
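The reconstruction step, laterally upsampling, interpolating, weighting, and coherently summing the low-resolution subarray images, can be sketched in a few lines of NumPy/SciPy; the uniform weights and upsampling factor below are placeholders, not the paper's derived values.

```python
# Schematic of the PSA reconstruction step: low-resolution complex
# subarray images are laterally upsampled, interpolated, weighted, and
# coherently summed. Uniform weights and the upsampling factor are
# placeholders, not the paper's derivation.
import numpy as np
from scipy.signal import resample

def psa_combine(subarray_images, upsample_factor, weights=None):
    """subarray_images: list of 2-D complex arrays (depth x lateral beams)."""
    n = len(subarray_images)
    weights = np.ones(n) / n if weights is None else weights
    combined = None
    for w, img in zip(weights, subarray_images):
        # Lateral upsampling/interpolation along the beam axis (axis=1).
        up = resample(img, img.shape[1] * upsample_factor, axis=1)
        combined = w * up if combined is None else combined + w * up
    # Coherent sum: phases add before the magnitude is taken for display.
    return np.abs(combined)

# Example with synthetic data: 4 subarrays, 64 depth samples, 16 beams each.
rng = np.random.default_rng(0)
imgs = [rng.standard_normal((64, 16)) + 1j * rng.standard_normal((64, 16))
        for _ in range(4)]
print(psa_combine(imgs, upsample_factor=4).shape)  # (64, 64)
```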
Color image enhancement using retinex with robust envelope
In this paper, we propose a color image enhancement method that uses retinex with a robust envelope to improve the visual appearance of an image. The word “retinex” is a hybrid of “retina” and “cortex”, suggesting that human visual perception is involved in this color image enhancement. To avoid the gray-world violation, a color-shifting problem, an input RGB color image is transformed into an HSV color image, and only the V component is enhanced. Furthermore, to prevent halo artifacts, we construct a robust envelope with gradient-dependent weighting to limit disturbances around intensity gaps such as edges and corners. Our experimental results show that the proposed method yields better (almost halo-free) performance than traditional image enhancement methods.
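The pipeline can be sketched with OpenCV as follows: convert to HSV, enhance only the V channel with a retinex-style log ratio against a smoothed envelope, and convert back. A plain Gaussian surround stands in for the paper's robust, gradient-weighted envelope, which is its actual contribution.

```python
# Sketch of the processing pipeline: convert RGB to HSV, enhance only V
# with a retinex-style log-ratio against a smoothed envelope, convert
# back. A plain Gaussian surround stands in for the paper's robust,
# gradient-weighted envelope.
import cv2
import numpy as np

def retinex_enhance(bgr, sigma=40):
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    v = hsv[:, :, 2] + 1.0                      # avoid log(0)
    envelope = cv2.GaussianBlur(v, (0, 0), sigma)
    # Retinex: reflectance estimate = log(image) - log(illumination).
    r = np.log(v) - np.log(envelope + 1.0)
    # Stretch the reflectance back to [0, 255] for display.
    r = (r - r.min()) / (r.max() - r.min() + 1e-8) * 255.0
    hsv[:, :, 2] = r
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)

img = cv2.imread("input.jpg")
cv2.imwrite("enhanced.jpg", retinex_enhance(img))
```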
Percutaneous left atrial appendage closure for stroke prophylaxis in patients with atrial fibrillation: 2.3-Year Follow-up of the PROTECT AF (Watchman Left Atrial Appendage System for Embolic Protection in Patients with Atrial Fibrillation) Trial.
BACKGROUND The multicenter PROTECT AF study (Watchman Left Atrial Appendage System for Embolic Protection in Patients With Atrial Fibrillation) was conducted to determine whether percutaneous left atrial appendage closure with a filter device (Watchman) was noninferior to warfarin for stroke prevention in atrial fibrillation. METHODS AND RESULTS Patients (n=707) with nonvalvular atrial fibrillation and at least 1 risk factor (age >75 years, hypertension, heart failure, diabetes, or prior stroke/transient ischemic attack) were randomized to either the Watchman device (n=463) or continued warfarin (n=244) in a 2:1 ratio. After device implantation, warfarin was continued for ≈45 days, followed by clopidogrel for 4.5 months and lifelong aspirin. Study discontinuation rates were 15.3% (71/463) and 22.5% (55/244) for the Watchman and warfarin groups, respectively. The time in therapeutic range for the warfarin group was 66%. The composite primary efficacy end point included stroke, systemic embolism, and cardiovascular death, and the primary analysis was by intention to treat. After 1588 patient-years of follow-up (mean 2.3±1.1 years), the primary efficacy event rates were 3.0% and 4.3% (percent per 100 patient-years) in the Watchman and warfarin groups, respectively (relative risk, 0.71; 95% confidence interval, 0.44-1.30), which met the criteria for noninferiority (probability of noninferiority >0.999). There were more primary safety events in the Watchman group (5.5% per year; 95% confidence interval, 4.2%-7.1% per year) than in the control group (3.6% per year; 95% confidence interval, 2.2%-5.3% per year; relative risk, 1.53; 95% confidence interval, 0.95-2.70). CONCLUSIONS The "local" strategy of left atrial appendage closure is noninferior to "systemic" anticoagulation with warfarin. PROTECT AF has, for the first time, implicated the left atrial appendage in the pathogenesis of stroke in atrial fibrillation. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT00129545.
IL-6–174G/C genotype is associated with the bone mineral density response to oestrogen replacement therapy in post-menopausal women
A reduction in interleukin-6 (IL-6) activity may contribute to the beneficial effects of hormone replacement therapy (HRT) on the menopausal decline in bone mineral density (BMD). We have examined this hypothesis using a genetic strategy. The –174C (rather than G) IL-6 gene variant is associated with lower IL-6 expression. As such, we might anticipate the C allele to be associated with a greater response to HRT. We have tested this hypothesis. Mean three-site [spine (L1-L4), neck of femur, and Ward’s triangle] BMD was measured in 65 women in a 1-year randomised controlled trial of HRT with 0.625 mg oestrogen/day and 0.15 mg norgestrel (n=30). Baseline BMD was genotype-independent for both the control and HRT group. In the control group, the percentage change in BMD after 1 year was similar between genotypes (P=0.45). In contrast, in the HRT group, the rise was genotype-dependent. Those homozygous for the G allele showed a 3.62 (2.14)% increase in BMD compared with 10.44 (4.68)% for the C-homozygous group. Heterozygotes had an intermediate BMD increase of 5.6 (2.82)% [P=0.006 (P value for interaction between HRT and genotype was 0.04)]. Although the study was limited by its small sample size, these are the first data to demonstrate the importance of IL-6 genotype in determining response to oestrogen therapy, rather than its physiological withdrawal.
Adversarial Text Generation Without Reinforcement Learning
Generative Adversarial Networks (GANs) have experienced a recent surge in popularity, performing competitively in a variety of tasks, especially in computer vision. However, GAN training has shown limited success in natural language processing. This is largely because sequences of text are discrete, and thus gradients cannot propagate from the discriminator to the generator. Recent solutions use reinforcement learning to propagate approximate gradients to the generator, but this is inefficient to train. We propose to utilize an autoencoder to learn a low-dimensional representation of sentences. A GAN is then trained to generate its own vectors in this space, which decode to realistic utterances. We report both random and interpolated samples from the generator. Visualization of sentence vectors indicates that our model correctly learns the latent space of the autoencoder. Both human ratings and BLEU scores show that our model generates realistic text against competitive baselines.
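A minimal PyTorch sketch of the latent-space GAN follows: the generator emits vectors in the (frozen) autoencoder's sentence space and the discriminator separates them from real encodings, so gradients flow without any RL machinery. The dimensions and stand-in data are assumptions.

```python
# Minimal sketch of the latent-space GAN: an autoencoder gives fixed
# sentence vectors; a generator learns to emit vectors in that space and
# a discriminator tells them from real encodings. Dimensions and the
# encoder itself are placeholders; no RL signal is needed because the
# latent space is continuous.
import torch
import torch.nn as nn

latent_dim, noise_dim = 128, 64

G = nn.Sequential(nn.Linear(noise_dim, 256), nn.ReLU(),
                  nn.Linear(256, latent_dim))
D = nn.Sequential(nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_vectors):          # encodings from the frozen autoencoder
    b = real_vectors.size(0)
    fake = G(torch.randn(b, noise_dim))
    # Discriminator: real encodings -> 1, generated vectors -> 0.
    loss_d = (bce(D(real_vectors), torch.ones(b, 1)) +
              bce(D(fake.detach()), torch.zeros(b, 1)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator; gradients flow freely since the
    # latent space is continuous (no discrete tokens in the loop).
    loss_g = bce(D(fake), torch.ones(b, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

train_step(torch.randn(32, latent_dim))  # stand-in for encoded sentences
```

Generated vectors would then be passed through the autoencoder's decoder to produce the actual utterances.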
Physics 101: Learning Physical Object Properties from Unlabeled Videos
We study the problem of learning physical properties of objects from unlabeled videos. Humans can learn basic physical laws when they are very young, which suggests that such tasks may be important goals for computational vision systems. We consider various scenarios: objects sliding down an inclined surface and colliding; objects attached to a spring; objects falling onto various surfaces, etc. Many physical properties like mass, density, and coefficient of restitution influence the outcome of these scenarios, and our goal is to recover them automatically. We have collected 17,408 video clips containing 101 objects of various materials and appearances (shapes, colors, and sizes). Together, they form a dataset, named Physics 101, for studying object-centered physical properties. We propose an unsupervised representation learning model, which explicitly encodes basic physical laws into the structure and use them, with automatically discovered observations from videos, as supervision. Experiments demonstrate that our model can learn physical properties of objects from video. We also illustrate how its generative nature enables solving other tasks such as outcome prediction.
Multiple-dose safety, tolerability, and pharmacokinetics of oral nemonoxacin (TG-873870) in healthy volunteers.
Nemonoxacin (TG-873870) is a novel nonfluorinated quinolone with broad-spectrum activities against Gram-positive and Gram-negative aerobic, anaerobic, and atypical pathogens, as well as against methicillin-resistant Staphylococcus aureus, vancomycin-resistant S. aureus, and multiple-resistant bacterial pathogens. We conducted a randomized, double-blind, placebo-controlled, dose-escalating study to ascertain the safety, tolerability, and pharmacokinetics of nemonoxacin. We enrolled 46 healthy volunteers and used a once-daily oral-dosing range of 75 to 1,000 mg for 10 days. Additionally, the food effect was evaluated in subjects in the 500-mg cohort. Nemonoxacin was generally safe and well tolerated, with no significant changes in the clinical laboratory tests or electrocardiograms. Adverse effects, including headache, contact dermatitis, and rash, were mild and resolved spontaneously. Nemonoxacin was rapidly absorbed within 2 h postdosing, and generally, a steady state was reached after 3 days. The maximum plasma concentration and the area under the plasma concentration-time curve were dose proportional over the dosing range. The elimination half-life was approximately 7.5 h and 19.7 h on days 1 and 10, respectively. Approximately 37 to 58% of the drug was excreted in the urine. Food affected the pharmacokinetics, with decreases in the maximum plasma concentration and area under the plasma concentration-time curve of 46% and 27%, respectively. However, the free AUC/MIC(90) of nemonoxacin was more than 100 under both the fasting and fed conditions, predicting the efficacy of nemonoxacin against most of the tested pathogens. In conclusion, the results support further clinical investigation of once-daily nemonoxacin administration for antibiotic-sensitive and antibiotic-resistant bacterial infections.
Increasing X-ray image interpretation competency of cargo security screeners
X-ray screening of containers and unit load devices in the area of cargo shipping is becoming an essential and common feature at ports and airports all over the world. The detection of prohibited items in X-ray images is a challenging task for screening officers as they need to know which items are prohibited and what they look like in X-ray images. The main aim of this study was to investigate whether X-ray image interpretation competency of cargo security screeners can be increased by computer-based training. More specifically, effects of training were investigated by conducting tests before training started and after approximately three months of training. Moreover, it was examined whether viewing X-ray images in pseudo color would lead to a better detection performance compared to when X-ray images are shown in greyscale. Recurrent computer-based training resulted in large performance increases after three months. No significant difference in detection performance could be found for tests when using X-ray images in greyscale vs. pseudo color. Relevance to industry: Cargo X-ray screening is becoming a common feature at ports and airports. The identification and detection of prohibited items in X-ray images highly depends on human operators and their competences regarding X-ray image interpretation. Thus, research on appropriate training methods and enhancements of the human factor are essential to achieve and maintain high levels of security.
Quality improvement guidelines for percutaneous drainage/aspiration of abscess and fluid collections.
The membership of the Society of Interventional Radiology (SIR) Standards of Practice Committee represents experts in a broad spectrum of interventional procedures from both the private and academic sectors of medicine. Generally, Standards of Practice Committee members dedicate the vast majority of their professional time to performing interventional procedures; as such, they represent a valid broad expert constituency of the subject matter under consideration for standards production. Technical documents specifying the exact consensus and literature review
Combining Multiple Coverage Criteria in Search-Based Unit Test Generation
Automated test generation techniques typically aim at maximising coverage of well-established structural criteria such as statement or branch coverage. In practice, generating tests only for one specific criterion may not be sufficient when testing object oriented classes, as standard structural coverage criteria do not fully capture the properties developers may desire of their unit test suites. For example, covering a large number of statements could be easily achieved by just calling the main method of a class; yet, a good unit test suite would consist of smaller unit tests invoking individual methods, and checking return values and states with test assertions. There are several different properties that test suites should exhibit, and a search-based test generator could easily be extended with additional fitness functions to capture these properties. However, does search-based testing scale to combinations of multiple criteria, and what is the effect on the size and coverage of the resulting test suites? To answer these questions, we extended the EvoSuite unit test generation tool to support combinations of multiple test criteria, defined and implemented several different criteria, and applied combinations of criteria to a sample of 650 open source Java classes. Our experiments suggest that optimising for several criteria at the same time is feasible without increasing computational costs: When combining nine different criteria, we observed an average decrease of only 0.4% for the constituent coverage criteria, while the test suites may grow up to 70%.
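The combination mechanism itself is simple: each criterion contributes a normalized fitness in [0, 1] and the search minimizes their sum. EvoSuite is written in Java; the Python sketch below illustrates only the combination scheme, with invented stand-in criteria rather than EvoSuite's actual fitness implementations.

```python
# Language-agnostic sketch (in Python) of combining coverage criteria:
# each criterion yields a distance >= 0 to full coverage, distances are
# normalized into [0, 1], and the search minimizes their equal-weight
# sum. The criteria below are invented stand-ins.
def normalize(x):
    return x / (1.0 + x)          # standard branch-distance normalization

def combined_fitness(test_suite, criteria):
    # Equal-weight sum: optimizing one criterion cannot dominate the rest.
    return sum(normalize(c(test_suite)) for c in criteria)

# Stand-in criteria, each returning "distance to full coverage" >= 0.
def branch_distance(suite):    return max(0, 10 - suite["branches"])
def statement_distance(suite): return max(0, 50 - suite["statements"])
def assertion_distance(suite): return max(0, 5 - suite["assertions"])

suite = {"branches": 7, "statements": 45, "assertions": 2}
print(combined_fitness(suite, [branch_distance, statement_distance,
                               assertion_distance]))
```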
Direct Torque Control of Induction Motors Utilizing Three-Level Voltage Source Inverters
A new control strategy for induction motors based on direct torque control is presented, which employs a three-level inverter instead of the standard two-level inverter. The controller is designed to achieve a torque ripple reduction by taking advantage of the increased number of inverter states available in a three-level inverter. The harmonic distortion in the stator currents and the switching frequency of the semiconductor devices are also reduced in the new control system presented.
Long-term follow-up results of IFM99-03 and IFM99-04 trials comparing nonmyeloablative allotransplantation with autologous transplantation in high-risk de novo multiple myeloma.
Recent pilot studies combining a cytoreductive autologous stem cell transplantation (ASCT) with a reduced-intensity conditioning regimen (RIC) allograft have reported encouraging results in patients with de novo multiple myeloma (MM).1,2 However, it remains to be determined whether single autograft followed by RIC allograft approaches are superior to double ASCT programs. Until now, only 3 prospective studies comparing the combination of ASCT followed by RIC allograft with tandem ASCT have been reported, one from the Intergroupe Francophone du Myélome (IFM),3 one from Italy,4 and one from Spain.5 The Italian group reported a significant survival advantage in favor of allo-RIC, both for event-free survival (EFS) and overall survival (OS).4 In the Spanish trial, there was a trend toward a longer progression-free survival in favor of allo-RIC, but both EFS and OS were not significantly different between second ASCT and allo-RIC.5 Here we report the updated results of the IFM study.3 At the reference date of July 1, 2008, on an intent-to-treat basis and considering the entire population of 284 patients with a median follow-up of 56 months, the EFS did not significantly differ between tandem ASCT and single autograft followed by allo-RIC (median 22 vs 19 months, P = .58). Nevertheless, there was a trend for superior OS in the double ASCT arm (median 48 vs 34 months, P = .07; see Figure 1). When comparing the results of the 166 patients out of 219 who completed the whole tandem ASCT protocol with those of the 46 patients out of 65 who underwent the entire auto/allo-RIC program, no difference was observed regarding EFS (median 25 vs 21 months, P = .88), but there was again a trend for superior OS in favor of double ASCT (median OS, 57 vs 41 months, P = .08), due to longer survival after relapse in the tandem ASCT arm. Our results differ significantly from those reported by Bruno4 and Rosinol.5 This can be explained by the different study designs. Our study focused on patients with high-risk disease, that is, elevated β2-microglobulin plus chromosome 13 abnormalities, and the conditioning regimen before allo-RIC consisted of busulfan, fludarabine, and high-dose ATG, possibly eliminating part of the GVM effect.6,7 In the Italian trial,4 all patients irrespective of prognostic factors were included, the conditioning regimen before allo-RIC consisted of 2 Gy TBI, and the results of the tandem ASCT arm were surprisingly poor, with a median OS of 58 months in patients who completed the double autograft protocol, which is clearly inferior to the results of other recently reported series of double ASCT.8,9 In the recent Spanish trial, only chemosensitive patients failing to achieve complete or near-complete response after a first ASCT were treated with either a second ASCT or an allo-RIC, based on the availability of an HLA-identical sibling donor.5 Moreover, in this study the number of patients in the allo group was small and the preparative regimens for autotransplant patients were not uniform. Our long-term results may indicate that, in a subgroup of high-risk patients with de novo MM, a tandem ASCT procedure is at least equivalent or even superior to a combination of autologous followed by RIC allogeneic SCT. In patients with standard-risk and/or chemosensitive disease, RIC allograft could be an interesting option. Results of 2 recently completed prospective phase 3 trials in North America and Europe comparing double ASCT with single ASCT followed by nonmyeloablative allogeneic SCT are eagerly awaited.
Hard Drive Side-Channel Attacks Using Smartphone Magnetic Field Sensors
In this paper we present a new class of side-channel attacks on computer hard drives. Hard drives contain one or more spinning disks made of a magnetic material. In addition, they contain different magnets which rapidly move the head to a target position on the disk to perform a write or a read. The magnetic fields from the disk’s material and head are weak and well shielded. However, we show that the magnetic field due to the moving head can be picked up by sensors outside of the hard drive. With these measurements, we are able to deduce patterns about ongoing operations. For example, we can detect what type of operating system is booting up or what application is being started. Most importantly, no special equipment is necessary. All attacks can be performed by using an unmodified smartphone placed in proximity of a hard drive.
Vitamin D, d-dimer, Interferon γ, and sCD14 Levels are Independently Associated with Immune Reconstitution Inflammatory Syndrome: A Prospective, International Study☆
To determine the immunological profile most important for IRIS prediction, we evaluated 20 baseline plasma biomarkers in Acquired Immunodeficiency Syndrome (AIDS) patients initiating antiretroviral therapy (ART). Patients were enrolled in a randomized, placebo-controlled ART initiation trial in South Africa and Mexico to test whether maraviroc could prevent IRIS. Participants were classified prospectively as having IRIS within 6 months of ART initiation. Twenty plasma biomarkers were measured at study enrollment for 267 participants. Biomarkers were tested for predicting IRIS with adjustment for covariates chosen through forward stepwise selection. Sixty-two participants developed IRIS, and of these, 19 were tuberculosis (TB)-IRIS. Lower baseline levels of vitamin D and higher levels of d-dimer, interferon gamma (IFNγ), and sCD14 were independently associated with risk of IRIS in multivariate analyses. TB-IRIS cases exhibited a distinct biosignature from IRIS related to other pathogens, with increased levels of C-reactive protein (CRP), sCD14, and IFNγ and lower levels of Hb, which could be captured by a composite risk score. Elevated markers of Type 1 T helper (Th1) response, monocyte activation, and coagulation, together with low vitamin D, were independently associated with IRIS risk. Interventions that decrease immune activation and increase vitamin D levels warrant further study.
MapMarker: Extraction of Postal Addresses and Associated Information for General Web Pages
Address information is essential to people's daily lives. People often need to query the addresses of unfamiliar locations on the Web and then use map services to mark the location for directions. Although both address information and map services are available online, they are not well combined. Users usually need to copy an individual address from one Web site and paste it into another Web site with map services to locate its direction. Such copy-and-paste operations have to be repeated if multiple addresses are listed on a single page, such as a public school list or an apartment list. Furthermore, the information associated with each individual address has to be copied and included on each marker for better comprehension. Our research is devoted to automating the above process and making the combination an easier task for users. The main techniques applied here include postal address extraction and associated information extraction. We apply a sequence labeling algorithm based on Conditional Random Fields (CRFs) to train models for address extraction. Meanwhile, using the extracted addresses as landmarks, we apply pattern mining to identify the boundaries of address blocks and extract the information associated with each individual address. The experimental results show a high F-score of 91% for postal address extraction and 87% accuracy for associated information extraction.
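The sequence-labeling step can be sketched with the sklearn-crfsuite package: tokens are tagged with B-ADDR/I-ADDR/O so contiguous address spans can be pulled out of page text. The features and toy training data below are simple illustrative choices, not the paper's feature set.

```python
# Sketch of CRF-based address extraction as sequence labeling: tokens
# are tagged B-ADDR/I-ADDR/O. Features and toy data are illustrative
# choices, not the paper's feature set.
import sklearn_crfsuite  # pip install sklearn-crfsuite

def token_features(tokens, i):
    t = tokens[i]
    return {
        "lower": t.lower(),
        "is_digit": t.isdigit(),
        "is_title": t.istitle(),
        "prev": tokens[i - 1].lower() if i > 0 else "<s>",
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
    }

train_sents = [["Visit", "us", "at", "123", "Main", "St", ",", "Springfield"]]
train_tags = [["O", "O", "O", "B-ADDR", "I-ADDR", "I-ADDR", "I-ADDR",
               "I-ADDR"]]

X = [[token_features(s, i) for i in range(len(s))] for s in train_sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, train_tags)

test = ["Our", "office", ":", "456", "Oak", "Ave", "Portland"]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))
```

The predicted B-ADDR/I-ADDR spans would then serve as the landmarks for the pattern-mining step that extracts the associated information.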
Instant Object Detection in Lidar Point Clouds
In this letter, we present a new approach for object classification in continuously streamed Lidar point clouds collected from urban areas. The input of our framework is raw 3-D point cloud sequences captured by a Velodyne HDL-64 Lidar, and we aim to extract all vehicles and pedestrians in the neighborhood of the moving sensor. We propose a complete pipeline developed especially for distinguishing outdoor 3-D urban objects. First, we segment the point cloud into regions of ground, short objects (i.e., low foreground), and tall objects (high foreground). Then, using our novel two-layer grid structure, we perform efficient connected component analysis on the foreground regions to produce distinct groups of points that represent different urban objects. Next, we create depth images from the object candidates and apply an appearance-based preliminary classification using a convolutional neural network. Finally, we refine the classification with contextual features considering the possible expected scene topologies. We tested our algorithm on real Lidar measurements containing 1485 objects captured from different urban scenarios.
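The grid-based separation step can be sketched as follows: foreground points are binned into a ground-plane occupancy grid, and connected occupied cells become object candidates. The single-resolution grid and cell size below are simplifications of the paper's two-layer grid structure.

```python
# Sketch of grid-based object separation: foreground points are binned
# into a 2-D ground-plane grid and connected occupied cells become
# object candidates. A single-resolution grid stands in for the paper's
# two-layer grid structure.
import numpy as np
from scipy import ndimage

def segment_objects(points, cell=0.25):
    """points: (N, 3) foreground points (ground already removed)."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    idx = ((xy - origin) / cell).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
    grid[idx[:, 0], idx[:, 1]] = True
    # 8-connected component labeling over occupied cells.
    labels, n = ndimage.label(grid, structure=np.ones((3, 3)))
    point_labels = labels[idx[:, 0], idx[:, 1]]
    return [points[point_labels == k] for k in range(1, n + 1)]

# Example: two well-separated clusters of points.
rng = np.random.default_rng(1)
a = rng.normal([0, 0, 1], 0.2, (100, 3))
b = rng.normal([5, 5, 1], 0.2, (80, 3))
objects = segment_objects(np.vstack([a, b]))
print(len(objects), [len(o) for o in objects])  # 2 candidate objects
```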
Learning drivers for TORCS through imitation using supervised methods
In this paper, we apply imitation learning to develop drivers for The Open Racing Car Simulator (TORCS). Our approach can be classified as a direct method in that it applies supervised learning to learn car racing behaviors from the data collected from other drivers. In the literature, this approach is known to have led to extremely poor performance with drivers capable of completing only very small parts of a track. In this paper we show that, by using high-level information about the track ahead of the car and by predicting high-level actions, it is possible to develop drivers with performances that in some cases are only 15% lower than the performance of the fastest driver available in TORCS. Our experimental results suggest that our approach can be effective in developing drivers with good performance in non-trivial tracks using a very limited amount of data and computational resources. We analyze the driving behavior of the controllers developed using our approach and identify perceptual aliasing as one of the factors which can limit performance of our approach.
Virgin birth in a hammerhead shark.
Parthenogenesis has been documented in all major jawed vertebrate lineages except mammals and cartilaginous fishes (class Chondrichthyes: sharks, batoids and chimeras). Reports of captive female sharks giving birth despite being held in the extended absence of males have generally been ascribed to prior matings coupled with long-term sperm storage by the females. Here, we provide the first genetic evidence for chondrichthyan parthenogenesis, involving a hammerhead shark (Sphyrna tiburo). This finding also broadens the known occurrence of a specific type of asexual development (automictic parthenogenesis) among vertebrates, extending recently raised concerns about the potential negative effect of this type of facultative parthenogenesis on the genetic diversity of threatened vertebrate species.
Handling multiple objectives with particle swarm optimization
This paper presents an approach in which Pareto dominance is incorporated into particle swarm optimization (PSO) in order to allow this heuristic to handle problems with several objective functions. Unlike other current proposals to extend PSO to solve multiobjective optimization problems, our algorithm uses a secondary (i.e., external) repository of particles that is later used by other particles to guide their own flight. We also incorporate a special mutation operator that enriches the exploratory capabilities of our algorithm. The proposed approach is validated using several test functions and metrics taken from the standard literature on evolutionary multiobjective optimization. Results indicate that the approach is highly competitive and that it can be considered a viable alternative to solve multiobjective optimization problems.
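A compact numeric sketch of the two key ingredients, Pareto-dominance filtering into an external repository and repository-guided flight, is given below on a toy bi-objective problem; the parameters are illustrative and the paper's mutation operator is omitted for brevity.

```python
# Sketch of multi-objective PSO: Pareto dominance filters solutions into
# an external repository, and leaders drawn from that repository guide
# the flight. Toy problem and parameters are illustrative only; the
# paper's mutation operator is omitted.
import numpy as np

def dominates(f1, f2):
    return np.all(f1 <= f2) and np.any(f1 < f2)

def evaluate(x):                       # toy bi-objective (Schaffer-like)
    return np.array([x[0] ** 2, (x[0] - 2) ** 2])

rng = np.random.default_rng(0)
pos = rng.uniform(-4, 4, (30, 1))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([evaluate(p) for p in pos])
archive = []                           # external repository of nondominated solutions

for _ in range(100):
    for i, p in enumerate(pos):
        f = evaluate(p)
        if dominates(f, pbest_f[i]):
            pbest[i], pbest_f[i] = p.copy(), f
        # Insert into the archive if nondominated; prune dominated entries.
        if not any(dominates(af, f) for _, af in archive):
            archive = [(ax, af) for ax, af in archive if not dominates(f, af)]
            archive.append((p.copy(), f))
    for i in range(len(pos)):
        leader = archive[rng.integers(len(archive))][0]  # repository guide
        r1, r2 = rng.random(2)
        vel[i] = 0.4 * vel[i] + r1 * (pbest[i] - pos[i]) + r2 * (leader - pos[i])
        pos[i] = np.clip(pos[i] + vel[i], -4, 4)

print(len(archive), "nondominated solutions approximating the Pareto front")
```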
Translation and validation of a Chinese language version of the Early Childhood Oral Health Impact Scale (ECOHIS).
OBJECTIVE This study aimed to adapt the Early Childhood Oral Health Impact Scale (ECOHIS) for pre-school children in a Chinese-speaking community and to investigate its psychometric properties (validity and reliability). METHODS A Chinese language version of the ECOHIS was derived through forward-backward translation and tested for face and content validity with a focus group. A convenience sample of pre-school children (n = 111) was recruited (including a sub-sample with early childhood caries and caries-free children). Parents of the children self-completed the derived Chinese-ECOHIS measure. Validity of the measure was assessed by investigating the relationship between dental caries status and Chinese-ECOHIS scores (construct and criterion validity). A sub-sample of the parents repeated the ratings of the measure to enable reliability assessments. Both internal and test-retest reliability were determined. RESULTS A Chinese version of the ECOHIS was derived with minor modification to the original version. Chinese-ECOHIS scores were associated with children's caries experience (dmft) (r = 0.66, P < 0.05), supporting convergent validity. In addition, variations in ECOHIS scores were apparent with respect to caries and caries-free groups (P < 0.001), supporting the ability to distinguish between patient groups. The Cronbach's alpha value (internal reliability) for the total ECOHIS score was 0.91, and the intraclass correlation coefficient value (test-retest reliability) was 0.64. CONCLUSIONS A Chinese version of the ECOHIS was developed and demonstrated acceptable validity and reliability. These findings can enable assessments of pre-school child oral health-related quality of life in Chinese-speaking communities.
A precise and compact height of burst sensor for guided missiles
The Height of Burst (HOB) sensor is one of the critical parts of a guided missile. While seekers control the missile's guidance scheme, proximity sensors set the trigger for increased effectiveness of the warhead. For Roketsan's well-developed guided missiles, a novel proximity sensor has been developed. The sensor is designed for multi-purpose use. In this presentation, the application of the sensor is explained for operation as a HOB sensor in the range of 3 m to 50 m with ±1 m accuracy. Measurement results are also presented. The same sensor is currently being developed as a proximity sensor for missile defence.
Phase I study of tanespimycin in combination with bortezomib in patients with advanced solid malignancies
Purpose To determine the maximum tolerated dose (MTD) and characterize the dose-limiting toxicities (DLT) of tanespimycin when given in combination with bortezomib. Experimental design Phase I dose-escalating trial using a standard cohort “3+3” design performed in patients with advanced solid tumors. Patients were given tanespimycin and bortezomib twice weekly for 2 weeks in a 3 week cycle (days 1, 4, 8, 11 every 21 days). Results Seventeen patients were enrolled in this study, fifteen were evaluable for toxicity, and nine patients were evaluable for tumor response. The MTD was 250 mg/m2 of tanespimycin and 1.0 mg/m2 of bortezomib when used in combination. DLTs of abdominal pain (13 %), complete atrioventricular block (7 %), fatigue (7 %), encephalopathy (7 %), anorexia (7 %), hyponatremia (7 %), hypoxia (7 %), and acidosis (7 %) were observed. There were no objective responses. One patient had stable disease. Conclusions The recommended phase II dose for twice weekly 17-AAG and PS341 are 250 mg/m2 and 1.0 mg/m2, respectively, on days 1, 4, 8 and 11 of a 21 day cycle.
Secure End-to-End key establishment protocol for resource-constrained healthcare sensors in the context of IoT
The Internet of Things (IoT) is a ubiquitous concept in which physical objects are connected over the internet and provided with unique identifiers to enable their self-identification to other devices and the ability to transmit data over the network. Sensor nodes, heterogeneous in nature, are the main part of the IoT and may act as internet hosts or clients. Communication security and end-user privacy protection are major concerns in the development of the IoT, especially when these IP-enabled sensor nodes have limited resources. Secret key distribution for heterogeneous sensors becomes challenging due to the inconsistencies in their cryptographic primitives and computational resources, as in healthcare applications. This paper introduces a new end-to-end key establishment protocol that is lightweight for resource-constrained sensors and secure through strong encryption and authentication. By using this protocol, resource-constrained nodes can also benefit from the same security functionalities that are typical of unconstrained domains, without having to execute computationally intensive operations. The protocol is based on cooperation, offloading the heavy cryptographic operations of constrained nodes to neighboring trusted nodes or devices. Security analysis and performance evaluation results show that the proposed protocol is secure and sufficiently energy efficient.
Preference-based Teaching
We introduce a new model of teaching named "preference-based teaching" and a corresponding complexity parameter---the preference-based teaching dimension (PBTD)---representing the worst-case number of examples needed to teach any concept in a given concept class. Although the PBTD coincides with the well-known recursive teaching dimension (RTD) on finite classes, it is radically different on infinite ones: the RTD becomes infinite already for trivial infinite classes (such as half-intervals) whereas the PBTD evaluates to reasonably small values for a wide collection of infinite classes including classes consisting of so-called closed sets w.r.t. a given closure operator, including various classes related to linear sets over $\mathbb{N}_0$ (whose RTD had been studied quite recently) and including the class of Euclidean half-spaces. On top of presenting these concrete results, we provide the reader with a theoretical framework (of a combinatorial flavor) which helps to derive bounds on the PBTD.
Adverse Drug Event Discovery Using Biomedical Literature: A Big Data Neural Network Adventure
BACKGROUND The study of adverse drug events (ADEs) is a long-standing topic in the medical literature. In recent years, increasing numbers of scientific articles and health-related social media posts have been generated and shared daily, albeit with very limited use for ADE study and with little known about their content with respect to ADEs. OBJECTIVE The aim of this study was to develop a big data analytics strategy that mines the content of scientific articles and health-related Web-based social media to detect and identify ADEs. METHODS We analyzed the following two data sources: (1) biomedical articles and (2) health-related social media blog posts. We developed an intelligent and scalable text mining solution on big data infrastructures composed of Apache Spark, natural language processing, and machine learning. This was combined with an Elasticsearch NoSQL distributed database to explore and visualize ADEs. RESULTS The accuracy, precision, recall, and area under the receiver operating characteristic curve of the system were 92.7%, 93.6%, 93.0%, and 0.905, respectively, showing better results than traditional approaches in the literature. This work not only detected and classified ADE sentences from big data biomedical literature but also scientifically visualized ADE interactions. CONCLUSIONS To the best of our knowledge, this work is the first to investigate a big data machine learning strategy for ADE discovery on massive datasets downloaded from PubMed Central and social media. This contribution illustrates possible capacities in big data biomedical text analysis using advanced computational methods with real-time updates from new data published daily.
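The abstract above names the main pipeline ingredients (Apache Spark, NLP features, a trained classifier). As a rough illustration of how such pieces fit together, here is a minimal Spark ML sentence classifier; the toy sentences, column names, and model choice (TF-IDF plus logistic regression) are assumptions for the sketch, not the authors' actual pipeline.

```python
# Minimal sketch of an ADE sentence classifier: tokenizer -> TF-IDF ->
# logistic regression, expressed as a Spark ML pipeline. Toy data only.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import Tokenizer, HashingTF, IDF
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("ade-sketch").getOrCreate()

# toy labeled sentences: 1.0 = mentions an adverse drug event, 0.0 = does not
train = spark.createDataFrame([
    ("patient developed severe rash after starting amoxicillin", 1.0),
    ("the committee met to discuss funding priorities", 0.0),
    ("nausea and dizziness were reported following the second dose", 1.0),
    ("samples were stored at -80 degrees until analysis", 0.0),
], ["sentence", "label"])

pipeline = Pipeline(stages=[
    Tokenizer(inputCol="sentence", outputCol="tokens"),
    HashingTF(inputCol="tokens", outputCol="tf", numFeatures=1 << 18),
    IDF(inputCol="tf", outputCol="features"),
    LogisticRegression(maxIter=20),
])

model = pipeline.fit(train)
model.transform(train).select("sentence", "prediction").show(truncate=50)
```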
Motivation of a new approach for shape reconstruction based on FBG optical fibers: Considering the Bragg-grating composition as a sensor network
In various fields of application, the shape and the tip position of flexible, snake-like objects have to be reconstructed. For this, the considered objects are fitted with so-called shape sensors. These shape sensors are applied, e.g., in medical technology to support minimally invasive surgical interventions by tracking flexible instruments; this way, navigation systems can be considerably supported. The sensors consist of a solid snake-like body made of flexible carrier material, such as silicone, with FBG optical glass fibers embedded along the object axis. Guided along the observed instruments, the sensor is supposed to detect the instrument's shape by detecting its own. The fibers measure the strain at discrete points along the sensor body, which is caused by deformation of the sensor. From these values, the shape is estimated using specific algorithms. Accordingly, certain requirements are made regarding the position, orientation and exact number of the measurement units. As part of the manufacturing process of the sensor, however, exact control of fiber positioning cannot be realized. To compensate for this inaccuracy and other problems that occur, a fundamentally new calculation approach is presented in this paper. The basic idea is to consider the system of measurement units as a sensor network. The position and orientation of the units are not considered to be static, because they can only be measured after production and cannot be placed exactly at a planned position and orientation in a controlled way. The idea is realized by initializing a tensor field on a manifold representing the surface of the object. This allows the algorithm to be applied to measurement values obtained at randomly distributed positions along the sensor body. The new approach is promising, and more accuracy in shape sensing is expected to be achieved. The approach to surface characterization is developed in a way that makes it transferable to other applications. In the future, areas in general can also be analysed by applying adapted algorithms based on the same idea. Interpolation of, e.g., temperature and radiation fields can be done intelligently by measuring discrete values with efficiently distributed measurement units.
Extraction of Information Related to Adverse Drug Events from Electronic Health Record Notes: Design of an End-to-End Model Based on Deep Learning
BACKGROUND Pharmacovigilance and drug-safety surveillance are crucial for monitoring adverse drug events (ADEs), but the main ADE-reporting systems, such as the Food and Drug Administration Adverse Event Reporting System, face challenges such as underreporting. Therefore, as complementary surveillance, data on ADEs are extracted from electronic health record (EHR) notes via natural language processing (NLP). As NLP develops, many up-to-date machine-learning techniques are being introduced in this field, such as deep learning and multi-task learning (MTL). However, only a few studies have focused on employing such techniques to extract ADEs. OBJECTIVE We aimed to design a deep learning model for extracting ADEs and related information such as medications and indications. Since extraction of ADE-related information includes two steps, named entity recognition and relation extraction, our second objective was to improve the deep learning model using multi-task learning between the two steps. METHODS We employed the dataset from the Medication, Indication and Adverse Drug Events (MADE) 1.0 challenge to train and test our models. This dataset consists of 1089 EHR notes of cancer patients and includes 9 entity types, such as Medication, Indication, and ADE, and 7 types of relations between these entities. To extract information from the dataset, we proposed a deep-learning model that uses a bidirectional long short-term memory (BiLSTM) conditional random field network to recognize entities and a BiLSTM-Attention network to extract relations. To further improve the deep-learning model, we employed three typical MTL methods, namely, hard parameter sharing, parameter regularization, and task relation learning, to build three MTL models, called HardMTL, RegMTL, and LearnMTL, respectively. RESULTS Since extraction of ADE-related information is a two-step task, the result of the second step (ie, relation extraction) was used to compare all models. We used microaveraged precision, recall, and F1 as evaluation metrics. Our deep learning model achieved state-of-the-art results (F1=65.9%), which is significantly higher than that (F1=61.7%) of the best system in the MADE 1.0 challenge. HardMTL further improved the F1 by 0.8%, boosting the F1 to 66.7%, whereas RegMTL and LearnMTL failed to boost the performance. CONCLUSIONS Deep learning models can significantly improve the performance of ADE-related information extraction. MTL may be effective for named entity recognition and relation extraction, but it depends on the methods, data, and other factors. Our results can facilitate research on ADE detection, NLP, and machine learning.
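Of the three MTL variants mentioned, hard parameter sharing is the simplest to picture: both tasks backpropagate through one shared encoder. The following PyTorch sketch shows that structure with a shared BiLSTM feeding a token-tagging head and a relation head; all dimensions are invented, and the CRF and attention layers of the actual model are omitted.

```python
# Hard parameter sharing in miniature: one BiLSTM encoder, two task heads.
import torch
import torch.nn as nn

class SharedEncoderMTL(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128,
                 n_tags=19, n_relations=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # shared layers: gradients from both tasks update this encoder
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        # task-specific heads
        self.ner_head = nn.Linear(2 * hidden, n_tags)
        self.rel_head = nn.Linear(4 * hidden, n_relations)

    def forward(self, tokens, head_idx, tail_idx):
        h, _ = self.encoder(self.embed(tokens))      # (B, T, 2H)
        tag_logits = self.ner_head(h)                # per-token NER logits
        # represent a candidate relation by its two argument positions
        batch = torch.arange(tokens.size(0))
        pair = torch.cat([h[batch, head_idx], h[batch, tail_idx]], dim=-1)
        return tag_logits, self.rel_head(pair)

model = SharedEncoderMTL(vocab_size=5000)
tokens = torch.randint(0, 5000, (2, 12))             # two toy sentences
tag_logits, rel_logits = model(tokens,
                               head_idx=torch.tensor([1, 3]),
                               tail_idx=torch.tensor([5, 7]))
ner_loss = nn.CrossEntropyLoss()(tag_logits.flatten(0, 1),
                                 torch.randint(0, 19, (24,)))
rel_loss = nn.CrossEntropyLoss()(rel_logits, torch.randint(0, 8, (2,)))
(ner_loss + rel_loss).backward()     # both tasks train the shared BiLSTM
```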
The valve movement response of mussels: a tool in biological monitoring
Biological sensors are becoming more important to monitor the quality of the aquatic environment. In this paper the valve movement response of freshwater (Dreissena polymorpha) and marine (Mytilus edulis) mussels is presented as a tool in monitoring studies. Examples of various methods for data storage and data treatment are presented, enabling easier operation and lower detection limits. Several applications are mentioned, including an early warning system based on this valve movement response of mussels.
Investigation of Flotation Parameters for Copper Recovery from Enargite and Chalcopyrite Mixed Ore
A flotation pre-treatment study for the separation of enargite (Cu3AsS4) from chalcopyrite (CuFeS2) ores of different origins was investigated in this work. The copper ore bearing enargite mineral contained 5.87 mass% As and 16.50 mass% Cu, while the chalcopyrite-bearing ore contained 0.32 mass% As and 21.63 mass% Cu. The two ore samples were mixed at a 7:3 (enargite : chalcopyrite) weight ratio to prepare a mixed ore sample with an As content of 3.16 mass% and a Cu content of 18.25 mass% for the flotation study. The effects of particle size, slurry pH, flotation time, collector type, collector addition or dosage, and depressants were investigated to evaluate the efficiency of enargite separation from chalcopyrite and the recovery of both minerals as separate concentrates. For enargite single ore flotation, the 38–75 μm size fraction showed that over 98% of enargite was selectively recovered within 5 min at a slurry pH of 4, and the As content in the final tailings was reduced to 0.22 mass%. In mixed ore (enargite + chalcopyrite) flotation, 97% of enargite was first removed at pH 4, followed by chalcopyrite flotation at pH 8, and over 95% recovery was achieved in 15 min of flotation time. The As content in the final tailings was reduced to 0.1 mass%. [doi:10.2320/matertrans.M2011354]
The centrality of pivotal points in the evolution of scientific networks
In this paper, we describe the development of CiteSpace as an integrated environment for identifying and tracking thematic trends in scientific literature. The goal is to simplify the process of finding not only highly cited clusters of scientific articles, but also pivotal points and trails that are likely to characterize fundamental transitions of a knowledge domain as a whole. The trails of an advancing research field are captured through a sequence of snapshots of its intellectual structure over time in the form of Pathfinder networks. These networks are subsequently merged with a localized pruning algorithm. Pivotal points in the merged network are algorithmically identified and visualized using the betweenness centrality metric. An example of finding clinical evidence associated with reducing risks of heart diseases is included to illustrate how CiteSpace could be used. The contribution of the work is its integration of various change detection algorithms and interactive visualization capabilities to simplify users' tasks.
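The pivotal-point idea rests on betweenness centrality, which scores a node by the fraction of shortest paths passing through it. A toy example with networkx (not CiteSpace itself) shows why a bridging article between two clusters gets the top score.

```python
# Two tight co-citation clusters joined by one "pivot" article.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a1", "a2"), ("a2", "a3"), ("a1", "a3"),   # cluster A
                  ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),   # cluster B
                  ("a3", "pivot"), ("pivot", "b1")])          # bridging trail

centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node:6s} {score:.3f}")
# "pivot" dominates: it lies on every shortest path between the two clusters.
```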
Identifying Justifications in Written Dialogs by Classifying Text as Argumentative
In written dialog, discourse participants need to justify claims they make, to convince the reader the claim is true and/or relevant to the discourse. This paper presents a new task (with an associated corpus), namely detecting such justifications. We investigate the nature of such justifications, and observe that the justifications themselves often contain discourse structure. We therefore develop a method to detect the existence of certain types of discourse relations, which helps us classify whether a segment is a justification or not. Our task is novel, and our work is novel in that it uses a large set of connectives (which we call indicators), and in that it uses a large set of discourse relations, without choosing among them.
Teaching Life-Saving Manoeuvres in Primary School
Introduction. In the event of sudden cardiac arrest (SCA), early intervention provided by a layperson can be life-saving. Teaching first aid in primary school may increase the lifelong ability and motivation of young people to take action in an emergency. Objective. The aim of this article is to report a training experience in BLSD (Basic Life Support and Defibrillation) designed for a group of pupils in an Italian primary school, with an assessment of its effectiveness after a delay. Methods. The assessment was carried out using a multiple-choice questionnaire on a sample of 130 pupils aged 11-12, 62 trained in BLSD and 68 as a control group. The trained group also performed an emergency simulation to assess their learning of practical skills. Results. Using the t test, significant differences emerged in the questionnaire scores between the trained and control groups. The results of the skill test were positive, even for the most difficult manoeuvres such as opening airways, assessing breathing, or using an AED (Automated External Defibrillator). Conclusion. Although there are still some open questions regarding the ability to retain these skills in the medium/long term, the study shows that life-saving manoeuvres can be effectively taught to primary school pupils.
Similarity of binaries through re-optimization
We present a scalable approach for establishing similarity between stripped binaries (with no debug information). The main challenge in binary similarity is to establish similarity even when the code has been compiled using different compilers, with different optimization levels, or targeting different architectures. Overcoming this challenge, while avoiding false positives, is invaluable to the process of reverse engineering and the process of locating vulnerable code. We present a technique that is scalable and precise, as it alleviates the need for heavyweight semantic comparison by performing out-of-context re-optimization of procedure fragments. It works by decomposing binary procedures into comparable fragments and transforming them to a canonical, normalized form using the compiler optimizer, which enables finding equivalent fragments through simple syntactic comparison. We use a statistical framework built by analyzing samples collected “in the wild” to generate a global context that quantifies the significance of each pair of fragments, and uses it to lift pairwise fragment equivalence to whole procedure similarity. We have implemented our technique in a tool called GitZ and performed an extensive evaluation. We show that GitZ is able to perform millions of comparisons efficiently, and find similarity with high accuracy.
Biodiversity in the city: key challenges for urban green space management
Cities play important roles in the conservation of global biodiversity, particularly through the planning and management of urban green spaces (UGS). However, UGS management is subject to a complex assortment of interacting social, cultural, and economic factors, including governance, economics, social networks, multiple stakeholders, individual preferences, and social constraints. To help deliver more effective conservation outcomes in cities, we identify major challenges to managing biodiversity in UGS and important topics warranting further investigation. Biodiversity within UGS must be managed at multiple scales while accounting for various socioeconomic and cultural influences. Although the environmental consequences of management activities to enhance urban biodiversity are now beginning to be addressed, additional research and practical management strategies must be developed to balance human needs and perceptions while maintaining ecological processes.
Safe semi-autonomous control with enhanced driver modeling
During semi-autonomous driving, threat assessment is used to determine when controller intervention that overwrites or corrects the driver's input is required. Since today's semi-autonomous systems perform threat assessment by predicting the vehicle's future state while treating the driver's input as a disturbance, controller intervention is limited to just emergency maneuvers. In order to improve vehicle safety and reduce the aggressiveness of maneuvers, threat assessment must occur over longer prediction horizons, where the driver's behavior cannot be neglected. We propose a framework that divides the problem of semi-autonomous control into two components. The first component reliably predicts the vehicle's potential behavior by using empirical observations of the driver's pose. The second component determines when the semi-autonomous controller should intervene. To quantitatively measure the performance of the proposed approach, we define metrics to evaluate the informativeness of the prediction and the utility of the intervention procedure. A multi-subject driving experiment illustrates the usefulness, with respect to these metrics, of incorporating the driver's pose while designing a semi-autonomous system.
Overcoming resource underutilization in spatial CNN accelerators
Convolutional neural networks (CNNs) are revolutionizing a variety of machine learning tasks, but they present significant computational challenges. Recently, FPGA-based accelerators have been proposed to improve the speed and efficiency of CNNs. Current approaches construct an accelerator optimized to maximize the overall throughput of iteratively computing the CNN layers. However, this approach leads to dynamic resource underutilization because the same accelerator is used to compute CNN layers of radically varying dimensions. We present a new CNN accelerator design that improves the dynamic resource utilization. Using the same FPGA resources, we build multiple accelerators, each specialized for specific CNN layers. Our design achieves 1.3× higher throughput than the state of the art when evaluating the convolutional layers of the popular AlexNet CNN on a Xilinx Virtex-7 FPGA.
A Convolutional Neural Network for Modelling Sentences
The ability to accurately represent sentences is central to language understanding. We describe a convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) that we adopt for the semantic modelling of sentences. The network uses Dynamic k-Max Pooling, a global pooling operation over linear sequences. The network handles input sentences of varying length and induces a feature graph over the sentence that is capable of explicitly capturing short and long-range relations. The network does not rely on a parse tree and is easily applicable to any language. We test the DCNN in four experiments: small scale binary and multi-class sentiment prediction, six-way question classification and Twitter sentiment prediction by distant supervision. The network achieves excellent performance in the first three tasks and a greater than 25% error reduction in the last task with respect to the strongest baseline.
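For readers who want the pooling operation made concrete, here is a small PyTorch sketch of k-max pooling (keep the k largest activations along the sequence, preserving order) together with the depth-dependent schedule for k described in the paper; treat the exact function signatures as illustrative.

```python
import math
import torch

def kmax_pooling(x, k, dim=-1):
    # indices of the top-k values, re-sorted so sequence order is preserved
    idx = x.topk(k, dim=dim).indices.sort(dim=dim).values
    return x.gather(dim, idx)

def dynamic_k(layer, total_layers, seq_len, k_top):
    # k shrinks with depth but never drops below the top-level k
    return max(k_top, math.ceil((total_layers - layer) / total_layers * seq_len))

x = torch.tensor([[[3., 1., 4., 1., 5., 9., 2., 6.]]])  # (batch, chan, seq)
print(kmax_pooling(x, k=3))   # tensor([[[5., 9., 6.]]]) -- order preserved
print(dynamic_k(layer=1, total_layers=3, seq_len=8, k_top=4))  # 6
```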
A What-and-Where fusion neural network for recognition and tracking of multiple radar emitters
A neural network recognition and tracking system is proposed for classification of radar pulses in autonomous Electronic Support Measure systems. Radar type information is considered with position-specific information from active emitters in a scene. Type-specific parameters of the input pulse stream are fed to a neural network classifier trained on samples of data collected in the field. Meanwhile, a clustering algorithm is used to separate pulses from different emitters according to position-specific parameters of the input pulse stream. Classifier responses corresponding to different emitters are separated into tracks, or trajectories, one per active emitter, allowing for more accurate identification of radar types based on multiple views of emitter data along each emitter trajectory. Such a What-and-Where fusion strategy is motivated by a similar subdivision of labor in the brain. The fuzzy ARTMAP neural network is used to classify streams of pulses according to radar type using their functional parameters. Simulation results obtained with a radar pulse data set indicate that fuzzy ARTMAP compares favorably to several other approaches when performance is measured in terms of accuracy and computational complexity. Incorporation into fuzzy ARTMAP of negative match tracking (from ARTMAP-IC) facilitated convergence during training with this data set. Other modifications improved classification of data that include missing input pattern components and missing training classes. Fuzzy ARTMAP was combined with a bank of Kalman filters to group pulses transmitted from different emitters based on their position-specific parameters, and with a module to accumulate evidence from fuzzy ARTMAP responses corresponding to the track defined for each emitter. Simulation results demonstrate that the system provides a high level of performance on complex, incomplete and overlapping radar data.
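The position-specific grouping relies on standard Kalman filtering. As a self-contained illustration (not the paper's multi-emitter bank), the sketch below tracks a single emitter's 1-D position with a constant-velocity model; all noise parameters are made up.

```python
import numpy as np

F = np.array([[1., 1.], [0., 1.]])   # state transition for [pos, vel]
H = np.array([[1., 0.]])             # we observe position only
Q = 0.01 * np.eye(2)                 # process noise (assumed)
R = np.array([[0.5]])                # measurement noise (assumed)

x = np.array([[0.], [0.]])           # initial state estimate
P = np.eye(2)                        # initial covariance

for z in [1.1, 1.9, 3.2, 3.8, 5.1]:  # noisy position measurements
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    y = np.array([[z]]) - H @ x                 # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    print(f"z={z:4.1f}  pos={x[0,0]:5.2f}  vel={x[1,0]:5.2f}")
```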
DDoS: Survey of Traceback Methods
The problem of identifying Distributed Denial of Service (DDoS) attacks is one of the hardest threats in internet security. It is important to protect resources and to trace Denial of Service (DoS) attacks, but it is difficult to distinguish normal traffic from DoS attack traffic because attackers generally hide their identities/origins. In particular, attackers often use incorrect or spoofed source IP addresses, so tracing the source of a denial of service on the internet is very hard. Many techniques and methodologies are used to trace DDoS attacks. This paper presents some of the most widely used traceback techniques for addressing the problem. The main goal of this paper is to appraise and evaluate the different traceback methods for DDoS attacks.
QTLs affecting kernel size and shape in a two-rowed by six-rowed barley cross
The suitability of barley (Hordeum vulgare L.) grain for malting depends on many criteria, including the size, shape and uniformity of the kernels. Here, image analysis was used to measure kernel size and shape attributes (area, perimeter, length, width, F-circle and F-shape) in grain samples of 140 doubled-haploid lines from a two-rowed (cv Harrington) by six-rowed (cv Morex) barley cross. Interval mapping was used to map quantitative trait loci (QTLs) affecting the means and within-sample standard deviations of these attributes using a 107-marker genome map. Regions affecting one or more kernel size and shape traits were detected on all seven chromosomes. These included one near the vrs1 locus on chromosome 2 and one near the int-c locus on chromosome 4. Some, but not all, of the QTLs exhibited interactions with the environment and some QTLs affected the within-sample variability of kernel size and shape without affecting average kernel size and shape. When QTL analysis was conducted using data from only the two-rowed lines, the region on chromosome 2 was not detected but QTLs were detected elsewhere in the genome, including some that had not been detected in the analysis of the whole population. Analysis of only the six-rowed lines did not detect any QTLs affecting kernel size and shape attributes. QTL alleles that made kernels larger and/or rounder also tended to improve malt quality and QTL alleles that increased the variability of kernel size were associated with poor malt quality.
EVALUATION OF EFFECTIVENESS IN A NOVEL WOUND HEALING OINTMENT-CROCODILE OIL BURN OINTMENT
BACKGROUND Crocodile oil and its products are used as ointments for burns and scalds in traditional medicines. A new ointment formulation - crocodile oil burn ointment (COBO) - was developed to provide more efficient wound healing activity. The purpose of the study was to evaluate the burn healing efficacy of this new formulation by employing deep second-degree burns in a Wistar rat model. The analgesic and anti-inflammatory activities of COBO were also studied to provide some evidence for its further use. MATERIALS AND METHODS The wound healing potential of this formulation was evaluated by employing a deep second-degree burn rat model, and the efficiency was comparatively assessed against a reference ointment - (1% wt/wt) silver sulfadiazine (SSD). After 28 days, the animals were euthanized and the wounds were removed for transversal and longitudinal histological studies. Acetic acid-induced writhing in mice was used to evaluate the analgesic activity, and its anti-inflammatory activity was observed in xylene-induced edema in mice. RESULTS COBO enhanced burn wound healing (20.5±1.3 d), as indicated by a significant decrease in wound closure time compared with the burn control (25.0±2.16 d) (P<0.01). Hair follicles play an important role in the physiological functions of the skin, and their growth in the wound indicates the state of skin regeneration. Histological results showed that hair follicles were well distributed in the post-burn skin of the COBO treatment group, and the amounts of total, active, primary and secondary hair follicles in post-burn 28-day skin of the COBO treatment groups were higher than those in the burn control and SSD groups. On the other hand, the analgesic and anti-inflammatory activities of COBO were much better than those of the control group, while they were very close to those of moist exposed burn ointment (MEBO). CONCLUSIONS COBO accelerated wound closure, reduced inflammation, and had analgesic effects compared with SSD in a deep second-degree rat burn model. These findings suggest that COBO could be a potential therapy for treating human burns. Abbreviations: COBO, crocodile oil burn ointment; SSD, silver sulfadiazine; MEBO, moist exposed burn ointment; TCM, traditional Chinese medicine; CHM, Chinese herbal medicine; GC-MS, gas chromatography-mass spectrometry.
Mobile taskflow in context: a screenshot study of smartphone usage
The impact of interruptions on workflow and productivity has been extensively studied in the PC domain, but while fragmented user attention is recognized as an inherent aspect of mobile phone usage, little formal evidence exists of its effect on mobile productivity. Using a survey and a screenshot-based diary study we investigated the types of barriers people face when performing tasks on their mobile phones, the ways they follow up with such suspended tasks, and how frustrating the experience of task disruption is for mobile users. From 386 situated samples provided by 12 iPhone and 12 Pocket PC users, we distill a classification of barriers to the completion of mobile tasks. Our data suggest that moving to a PC to complete a phone task is common, yet not inherently problematic, depending on the task. Finally, we relate our findings to prior design guidelines for desktop workflow, and discuss how the guidelines can be extended to mitigate disruptions to mobile taskflow.
Differential Cryptanalysis of Feal and N-Hash
In [1,2] we introduced the notion of differential cryptanalysis and described its application to DES [11] and several of its variants. In this paper we show the applicability of differential cryptanalysis to the Feal family of encryption algorithms and to the N-Hash hash function. In addition, we show how to transform differential cryptanalytic chosen plaintext attacks into known plaintext attacks.
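The starting point of differential cryptanalysis is the difference distribution table of a cipher's S-box: counting, for each input XOR difference, how often each output difference appears. The toy below uses a 4-bit S-box (row 0 of DES S1, purely as an example, not Feal's round function) to surface the biased differentials an attacker would exploit.

```python
def ddt(sbox):
    # table[dx][dy] = number of inputs x with S(x) ^ S(x ^ dx) == dy
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for dx in range(n):
        for x in range(n):
            dy = sbox[x] ^ sbox[x ^ dx]
            table[dx][dy] += 1
    return table

sbox = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]   # DES S1 row 0, as a toy

best = max(((dx, row.index(max(row)), max(row))
            for dx, row in enumerate(ddt(sbox)) if dx), key=lambda t: t[2])
print("most biased differential: dx=%#x -> dy=%#x (%d/16 pairs)" % best)
```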
Einstein and the Kaluza-Klein particle
In his search for a unified field theory that could undercut quantum mechanics, Einstein considered five-dimensional classical Kaluza–Klein theory. He studied this theory most intensively during the years 1938–1943. One of his primary objectives was finding a non-singular particle solution. In the full theory this search got frustrated, and in the $x^5$-independent theory Einstein, together with Pauli, argued it would be impossible to find these structures.
Clinical and pharmacokinetic phase II study of fotemustine in refractory and relapsing multiple myeloma patients.
BACKGROUND Patients with relapsing or refractory multiple myeloma have a poor prognosis. Few compounds are active in these patients and response duration remains short. We report the results of an open phase II trial evaluating the efficacy and safety of fotemustine monotherapy. PATIENTS AND METHODS Twenty-one patients with relapsing (17) or refractory (4) multiple myeloma received fotemustine 100 mg/m2 on an outpatient basis on days 1 and 8 of the induction cycle, followed after a 6-week rest period by fotemustine 100 mg/m2 every 3 weeks until progression or unacceptable toxicity. Fotemustine pharmacokinetics during the first day of induction was compared between patients with normal or abnormal renal function. RESULTS Five of 20 eligible patients had an objective response, giving an intention-to-treat response rate of 25% [95% confidence interval (CI) 6% to 44%] and a 35.7% response rate (95% CI 11% to 61%) in the 14 patients having received at least four injections of fotemustine. The median time to objective response was 8.9 months. The median times to progression and survival were 13.8 and 23.1 months, respectively, with a 2-year survival rate of 49%. The main toxicity was myelosuppression with grade 3-4 neutropenia and thrombocytopenia in 66% and 71% of patients, respectively. There was one toxic death by sepsis after induction. The pharmacokinetic parameters in renal-impaired patients were not significantly different from those in patients with normal renal function, with a similar incidence of grade 3-4 toxicity in both groups. CONCLUSIONS Fotemustine as a single agent has definite activity in patients with relapsing or refractory multiple myeloma, with acceptable toxicity, and can be administered at conventional doses in patients with mild or moderate renal impairment.
Semantic Search of Unstructured Data using Contextual Network Graphs
The authors present a graph-based algorithm for searching potentially large collections of unstructured data, and discuss its implementation as a search engine designed to offer advanced relevance feedback features to users who may have limited familiarity with search tools. The technique, which closely resembles the spreading activation network model described by Scott Preece, uses a term-document matrix to generate a bipartite graph of term and document nodes representing the document collection. This graph can be searched by a simple recursive procedure that distributes energy from an initial query node. Nodes that acquire energy above a specified threshold comprise the result set. Initial results on live collections suggest that this technique may offer performance comparable to latent semantic indexing (LSI), while avoiding some of that technique’s computational pitfalls. Both the algorithm and its implementation in a production Web environment are discussed.
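A minimal version of the described energy-spreading search, assuming a toy bipartite graph and hand-picked edge weights: energy injected at a query term node flows to neighbors in proportion to edge weights, recursion stops below a threshold, and nodes that accumulate energy form the result set.

```python
def spread(graph, node, energy, threshold, scores):
    # accumulate energy at this node, then pass shares to neighbors
    scores[node] = scores.get(node, 0.0) + energy
    neighbors = graph[node]
    total = sum(w for _, w in neighbors)
    for nbr, w in neighbors:
        share = energy * w / total
        if share > threshold:          # stop recursing once energy is tiny
            spread(graph, nbr, share, threshold, scores)

# bipartite graph: term nodes ("t:...") <-> document nodes ("d:...")
graph = {
    "t:neural":  [("d:1", 2.0), ("d:2", 1.0)],
    "t:network": [("d:1", 1.0), ("d:3", 2.0)],
    "d:1": [("t:neural", 2.0), ("t:network", 1.0)],
    "d:2": [("t:neural", 1.0)],
    "d:3": [("t:network", 2.0)],
}

scores = {}
spread(graph, "t:neural", energy=1.0, threshold=0.05, scores=scores)
print(sorted(scores.items(), key=lambda kv: -kv[1]))  # ranked result set
```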
Design and analysis of a completely decoupled compliant parallel XY micro-motion stage
With the purpose of designing a completely decoupled XY micro-motion stage, a novel flexure hinge-based compliant parallel mechanism driven by piezoelectric actuators (PZTs) is presented in this paper. The mechanism, with a double symmetric structure, is constructed by employing double four-bar flexures as prismatic joints due to their better stiffness performance. To obtain an accurate model, the matrix method is applied for the compliance analysis. Then the dynamics model is derived via the Lagrange equation. Finally, finite element analysis (FEA) is carried out using ANSYS software to validate the models and evaluate the performance of the mechanism. The simulation results also reveal that the mechanism has ideal linearity in terms of its static properties.
Networks, Netwar, and Information- Age Terrorism a New Terrorism (with Old Roots)
The rise of network forms of organization is a key consequence of the ongoing information revolution. Business organizations are being newly energized by networking, and many professional militaries are experimenting with flatter forms of organization. In this chapter, we explore the impact of networks on terrorist capabilities, and consider how this development may be associated with a move away from emphasis on traditional, episodic efforts at coercion to a new view of terror as a form of protracted warfare. Seen in this light, the recent bombings of U.S. embassies in East Africa, along with the retaliatory American missile strikes, may prove to be the opening shots of a war between a leading state and a terror network. We consider both the likely context and the conduct of such a war, and offer some insights that might inform policies aimed at defending against and countering terrorism.
Coalition Battle Management Language (C-BML) Study Group Report
The objective of Battle Management Language (BML) is to define an unambiguous language to describe a commander’s intent, to be understood by both live forces and automated systems, for simulated and realworld operations. The resulting language is intended to be applicable not only to simulation systems, but also to operational command and control systems, and robotic systems. Within the last three years, multiple papers presented at the Simulation Interoperability Workshops (SIW) have dealt with the need for, and initial work in, Modeling & Simulation (M&S) to Command and Control (C2) Interoperability based on the use of unambiguous mission and task definitions. During the Spring 2004 SIW, a meeting of subject matter experts determined that a detailed evaluation of BML efforts at a Coalition level is necessary and subsequently drafted Terms of Reference (TOR) for a Simulation Interoperability Standards Organization (SISO) Study Group. The TOR for the Coalition BML (C-BML) Study Group was accepted by the SISO Standards Activity Committee and identifies the following tasks: • The Study Group shall conduct a Survey comprising as many international contributions applicable to the Coalition BML effort as possible. • The Study Group shall develop a plan for how these various efforts can contribute to a common Coalition BML specification within a methodological framework. • The Study Group shall formulate a set of Recommendations for a Coalition BML Product Development Group (PDG). The Coalition BML Study Group was subsequently formed in September 2004 to address these tasks. The Study Group has conducted a number of face-to-face and teleconference meetings through the year since the Fall 2004 SIW, involving a membership of over 100 persons from 11 different countries. This paper is an executive summary of the full Study Group Final Report. As the Study Group concludes, it recommends that a PDG be formed. The C-BML Study Group has worked closely with the Military Scenario Definition Language (MSDL) Study Group to coordinate both PDG proposals to ensure a consistent set of standards for initialization, tasking and reporting.
Benchmarking Web API Quality
Web APIs are increasingly becoming an integral part of web or mobile applications. As a consequence, performance characteristics and availability of the APIs used directly impact the user experience of end users. Still, quality of web APIs is largely ignored and simply assumed to be sufficiently good and stable. Especially considering geo-mobility of today’s client devices, this can lead to negative surprises at runtime. In this work, we present an approach and toolkit for benchmarking the quality of web APIs considering geo-mobility of clients. Using our benchmarking tool, we then present the surprising results of a geo-distributed 3-month benchmark run for 15 web APIs and discuss how application developers can deal with volatile quality both from an architectural and engineering point of view.
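At its core, such a benchmark is a timed loop around an HTTP call with percentile reporting; the paper's toolkit additionally distributes this across geographic regions. A minimal single-location sketch (the endpoint URL is a placeholder):

```python
import statistics
import time
import requests

def benchmark(url, runs=50):
    # time repeated GET requests and summarize latency in milliseconds
    latencies = []
    for _ in range(runs):
        t0 = time.perf_counter()
        r = requests.get(url, timeout=10)
        latencies.append((time.perf_counter() - t0) * 1000)
        r.raise_for_status()
    latencies.sort()
    return {
        "median_ms": statistics.median(latencies),
        "p95_ms": latencies[int(0.95 * len(latencies)) - 1],
        "max_ms": latencies[-1],
    }

print(benchmark("https://api.example.com/v1/status"))  # hypothetical endpoint
```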
Effects of Nasal CPAP Treatment on Insulin Resistance, Lipid Profile, and Plasma Leptin in Sleep Apnea
Obstructive sleep apnea has been linked with a metabolic syndrome characterized by dyslipidemia, dyscoagulation, hypertension, and diabetes mellitus type 2 and their cardiovascular consequences. This study was designed to determine the effects of 8 weeks of therapy with continuous positive airway pressure (CPAP) on insulin resistance, glucose, and lipid profile, and the relationship between leptin and insulin-resistance parameters in patients with moderate-to-severe obstructive sleep apnea. In 44 patients, serum cholesterol, triglycerides, high-density lipoprotein, low-density lipoprotein, very low-density lipoprotein, leptin, and insulin parameters were measured at baseline and after 8 weeks of CPAP. The insulin resistance index was based on the homeostasis model assessment (HOMA-IR) method. Insulin sensitivity (HOMA-S) and insulin secretion capacity (HOMA-β) were also calculated. Thirteen patients were excluded from statistical analyses due to noncompliant CPAP usage (<4 h per night). In 31 patients who used CPAP for ≥4 h per night, CPAP therapy reduced total cholesterol (P < 0.05), low-density lipoprotein (P < 0.05), and leptin (P < 0.05). Circulating leptin levels showed significant correlation with both HOMA-S and HOMA-IR at baseline and follow-up (P = 0.03 for all). In addition, there was no correlation between HOMA-IR and the severity of sleep apnea, as measured by the apnea-hypopnea index. In patients with moderate-to-severe obstructive sleep apnea, compliant CPAP usage may improve insulin secretion capacity and reduce leptin, total cholesterol, and low-density lipoprotein levels. Leptin showed a significant relationship with insulin resistance, and this relationship remained after 8 weeks of CPAP therapy.
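For reference, the HOMA indices used above have standard single-point formulas (fasting glucose in mmol/L, insulin in µU/mL); the sketch below implements those textbook forms, which may differ in detail from the study's calculations.

```python
def homa_ir(glucose_mmol_l, insulin_uU_ml):
    # insulin resistance index
    return glucose_mmol_l * insulin_uU_ml / 22.5

def homa_beta(glucose_mmol_l, insulin_uU_ml):
    # insulin secretion capacity (%), defined for glucose > 3.5 mmol/L
    return 20.0 * insulin_uU_ml / (glucose_mmol_l - 3.5)

def homa_s(glucose_mmol_l, insulin_uU_ml):
    # insulin sensitivity as the reciprocal of HOMA-IR, in percent
    return 100.0 / homa_ir(glucose_mmol_l, insulin_uU_ml)

g, i = 5.5, 10.0   # example fasting values
print(f"HOMA-IR={homa_ir(g, i):.2f}  "
      f"HOMA-B={homa_beta(g, i):.0f}%  HOMA-S={homa_s(g, i):.0f}%")
```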
A Pedestrian Detector Using Histograms of Oriented Gradients and a Support Vector Machine Classifier
This paper details the filtering subsystem of a tetra-vision-based pedestrian detection system. The complete system is based on the use of both visible and far-infrared cameras; in an initial phase it produces a list of areas of attention in the images which can contain pedestrians. This list is further refined using symmetry-based assumptions. Then, this result is fed to a number of independent validators that evaluate the presence of human shapes inside the areas of attention. Histograms of oriented gradients and Support Vector Machines are used as a filter and are demonstrated to successfully classify up to 91% of pedestrians in the areas of attention.
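The filter described is a standard HOG-plus-linear-SVM stage. A compact sketch with scikit-image and scikit-learn shows the shape of that computation; random arrays stand in for the cropped candidate windows, and all parameter values are just common defaults.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def describe(window):
    # 9-bin HOG over 8x8-pixel cells, 2x2-cell blocks (common settings)
    return hog(window, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

rng = np.random.default_rng(0)
windows = rng.random((40, 128, 64))      # 40 grayscale 128x64 toy windows
labels = rng.integers(0, 2, size=40)     # 1 = pedestrian (toy labels)

X = np.array([describe(w) for w in windows])
clf = LinearSVC(C=0.01).fit(X, labels)   # the linear classifier stage
print("predicted:", clf.predict(X[:5]))
```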
The Chronotron: A Neuron That Learns to Fire Temporally Precise Spike Patterns
In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.
Transforming Web Tables to a Relational Database
HTML tables represent a significant fraction of web data. The often complex headers of such tables are determined accurately using their indexing property. Isolated headers are factored to extract category hierarchies. Web tables are then transformed into a canonical form and imported into a relational database. The proposed processing allows for the formulation of arbitrary SQL queries over the collection of induced relational tables. Keywords: table segmentation; Wang categories; header paths; relational table SQL queries
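The end-to-end flow (HTML table, canonical relation, SQL) can be miniaturized with pandas and sqlite3; the table content below is invented, and this sketch skips the paper's header-factoring step.

```python
import sqlite3
from io import StringIO
import pandas as pd

html = StringIO("""
<table>
  <tr><th>Country</th><th>Year</th><th>Population</th></tr>
  <tr><td>Iceland</td><td>2020</td><td>366425</td></tr>
  <tr><td>Malta</td><td>2020</td><td>514564</td></tr>
</table>
""")

df = pd.read_html(html)[0]              # canonical form: one flat relation
conn = sqlite3.connect(":memory:")
df.to_sql("population", conn, index=False)

for row in conn.execute(
        "SELECT Country FROM population WHERE Population > 400000"):
    print(row)                          # ('Malta',)
```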
Advances in Topological Vulnerability Analysis
Currently, network administrators must rely on labor-intensive processes for tracking network configurations and vulnerabilities, which requires a great deal of expertise and is error prone. The organization of networks and the interdependencies of vulnerabilities are so complex as to make traditional vulnerability analysis inadequate. We describe a Topological Vulnerability Analysis (TVA) approach that analyzes vulnerability dependencies and shows all possible attack paths into a network. From models of the network vulnerabilities and potential attacker exploits, we discover attack paths (organized as graphs) that convey the impact of individual and combined vulnerabilities on overall security. We provide sophisticated attack graph visualizations, with high-level overviews and detail drill-down. Decision support capabilities let analysts make optimal tradeoffs between safety and availability, and show how to best apply limited security resources. We employ efficient algorithms that scale well to larger networks.
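Attack-path discovery of this flavor can be pictured as path enumeration over an exploit-dependency graph. The toy below (networkx, invented network) lists every path from an attacker foothold to a critical asset, which is the raw material TVA's analyses and visualizations are built on.

```python
import networkx as nx

# each edge means "compromising X enables exploiting Y"
G = nx.DiGraph()
G.add_edges_from([
    ("internet", "web_server"),      # e.g. exposed HTTP vulnerability
    ("web_server", "app_server"),
    ("internet", "vpn_gateway"),
    ("vpn_gateway", "app_server"),
    ("app_server", "database"),      # critical asset
])

for path in nx.all_simple_paths(G, "internet", "database"):
    print(" -> ".join(path))
# Cutting the single edge app_server -> database blocks every path:
# the kind of minimal-fix insight an attack graph makes visible.
```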
Extended duration of the detectable stage by adding HPV test in cervical cancer screening
The human papillomavirus (HPV) test could improve the (cost-)effectiveness of cervical screening by selecting women with a very low risk for cervical cancer during a long period. An analysis of a longitudinal study suggests that women with a negative Pap smear and a negative HPV test have a strongly reduced risk of developing cervical abnormalities in the years following the test, and that HPV testing lengthens the detectable stage by 2–5 years, compared to Pap smear detection alone.
Recursive teaching dimension, VC-dimension and sample compression
This paper is concerned with various combinatorial parameters of classes that can be learned from a small set of examples. We show that the recursive teaching dimension, recently introduced by Zilles et al. (2008), is strongly connected to known complexity notions in machine learning, e.g., the self-directed learning complexity and the VC-dimension. To the best of our knowledge these are the first results unveiling such relations between teaching and query learning as well as between teaching and the VC-dimension. It will turn out that for many natural classes the RTD is upper-bounded by the VCD, e.g., classes of VC-dimension 1, intersection-closed classes and finite maximum classes. However, we will also show that there are certain (but rare) classes for which the recursive teaching dimension exceeds the VC-dimension. Moreover, for maximum classes, the combinatorial structure induced by the RTD, called a teaching plan, is highly similar to the structure of sample compression schemes. Indeed one can transform any repetition-free teaching plan for a maximum class C into an unlabeled sample compression scheme for C and vice versa, where the latter is produced by (i) the corner-peeling algorithm of Rubinstein and Rubinstein (2012) and (ii) the tail matching algorithm of Kuzmin and Warmuth (2007).
Aggressive Sampling for Multi-class to Binary Reduction with Applications to Text Classification
We address the problem of multi-class classification in the case where the number of classes is very large. We propose a double sampling strategy on top of a multi-class to binary reduction strategy, which transforms the original multi-class problem into a binary classification problem over pairs of examples. The aim of the sampling strategy is to overcome the curse of long-tailed class distributions exhibited in the majority of large-scale multi-class classification problems and to reduce the number of pairs of examples in the expanded data. We show that this strategy does not alter the consistency of the empirical risk minimization principle defined over the double sample reduction. Experiments are carried out on DMOZ and Wikipedia collections with 10,000 to 100,000 classes, where we show the efficiency of the proposed approach in terms of training and prediction time, memory consumption, and predictive performance with respect to state-of-the-art approaches.
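The reduction itself can be sketched compactly: each multi-class example is turned into a handful of binary "true class vs sampled rival class" pairs rather than pairs against all classes. In the sketch below the joint feature map (distance to a class centroid) and all sizes are stand-ins for the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes, dim = 50, 20
X = rng.normal(size=(200, dim))
y = rng.integers(0, n_classes, size=200)
# crude class representations: per-class centroids (a stand-in feature map)
centroids = np.stack([X[y == c].mean(axis=0) if (y == c).any()
                      else np.zeros(dim) for c in range(n_classes)])

def to_binary_pairs(x, true_c, n_sampled=5):
    # sample a few rival classes instead of expanding against all of them
    rivals = rng.choice([c for c in range(n_classes) if c != true_c],
                        size=n_sampled, replace=False)
    phi_true = np.abs(x - centroids[true_c])
    pairs = []
    for c in rivals:
        diff = phi_true - np.abs(x - centroids[c])   # pairwise joint features
        pairs.append((diff, +1))
        pairs.append((-diff, -1))                    # symmetric negative pair
    return pairs

binary_data = [p for xi, yi in zip(X, y) for p in to_binary_pairs(xi, yi)]
print(len(binary_data), "binary examples from 200 multi-class examples")
```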
Review on fraud detection methods in credit card transactions
Cashless transactions such as online transactions, credit card transactions, and mobile wallets are becoming more popular in financial transactions nowadays. With the increased number of such cashless transactions, the number of fraudulent transactions is also increasing. Fraud can be detected by analyzing the spending behavior of customers (users) from previous transaction data. If any deviation from the available patterns is noticed in spending behavior, it is possibly a fraudulent transaction. To detect fraudulent behavior, banks and credit card companies use various data mining methods such as decision trees, rule-based mining, neural networks, fuzzy clustering approaches, hidden Markov models, or hybrid approaches of these methods. Any of these methods is applied to find out the normal usage patterns of customers (users) based on their past activities. The objective of this paper is to provide a comparative study of different techniques to detect fraud.
Bee venom ameliorates ovalbumin induced allergic asthma via modulating CD4+CD25+ regulatory T cells in mice.
Asthma is a potentially life-threatening inflammatory disease of the lung characterized by the presence of large numbers of CD4+ T cells. These cells produce the Th2 and Th17 cytokines that are thought to orchestrate the inflammation associated with asthma. Bee venom (BV) has traditionally been used to relieve pain and to treat chronic inflammatory diseases. Recent reports have suggested that BV might be an effective treatment for allergic diseases. However, there are still unanswered questions related to the efficacy of BV therapy in treating asthma and its therapeutic mechanism. In this study, we evaluated whether BV could inhibit asthma and whether BV inhibition of asthma could be correlated with regulatory T cells (Treg) activity. We found that BV treatment increased Treg populations and suppressed the production of Th1, Th2 and Th17-related cytokines in an in vitro culture system, including IL2, IL4, and IL17. Interestingly, production of IL10, an anti-inflammatory cytokine secreted by Tregs, was significantly augmented by BV treatment. We next evaluated the effects of BV treatment on allergic asthma in an ovalbumin (OVA)-induced mouse model of allergic asthma. Cellular profiling of the bronchoalveolar lavage (BAL) and histopathologic analysis demonstrated that peribronchial and perivascular inflammatory cell infiltrates were significantly lowered following BV treatment. BV also ameliorated airway hyperresponsiveness, a hallmark symptom of asthma. In addition, IL4 and IL13 levels in the BAL fluid were decreased in the BV treated group. Surprisingly, the beneficial effects of BV treatment on asthma were eradicated following Treg depletion by anti-CD25 antibody injection, suggesting that the major therapeutic targets of BV were Tregs. These results indicate that BV efficiently diminishes bronchial inflammation in an OVA-induced allergic asthma murine model, and that this effect might correlate with Tregs, which play an important role in maintaining immune homeostasis and suppressing the function of other T cells to limit the immune response. These results also suggest that BV has potential therapeutic value for controlling allergic asthma responses.
Twin Studies and Other Genetical Investigations in the Danish Cancer Registry
League, and working in close collaboration with the National Health Service, is an organization with the primary intention to register all cancer cases, for research purposes. This is done through a voluntary system; all hospitals notify their cancer cases to the registry, and these notifications are supplemented with death certificates for all persons who are known to have suffered from malignant diseases, including leukaemias, myelomata and brain tumours. From Clemmesen's studies on occupational mortality from cancer in Denmark it was clear, already before the start of the Registry, that a complete investigation of the occurrence of cancer demanded at least an estimation of the role played by heredity in the origin of various cancers. It was thought to be especially valuable to carry out such an investigation alongside the mapping out of cancer incidence with regard to age, occupation, and other variables, and to compare the hereditary tendency shown by cancers of various sites in the same population and at the same time.
Vehicle-to-vehicle wireless communication protocols for enhancing highway traffic safety
This article presents an overview of highway cooperative collision avoidance (CCA), which is an emerging vehicular safety application using the IEEE- and ASTM-adopted Dedicated Short Range Communication (DSRC) standard. Along with a description of the DSRC architecture, we introduce the concept of CCA and its implementation requirements in the context of a vehicle-to-vehicle wireless network, primarily at the Medium Access Control (MAC) and the routing layer. An overview is then provided to establish that the MAC and routing protocols from traditional Mobile Ad Hoc networks are not directly applicable for CCA and similar safety-critical applications. Specific constraints and future research directions are then identified for packet routing protocols used to support such applications in the DSRC environment. In order to further explain the interactions between CCA and its underlying networking protocols, we present an example of the safety performance of CCA using simulated vehicle crash experiments. The results from these experiments are also used to demonstrate the need for network data prioritization for safety-critical applications such as CCA. Finally, the performance sensitivity of CCA to unreliable wireless channels is discussed based on the experimental results.
Advanced Persistent Threat Detection System
The Advanced Persistent Threat (APT) has quickly risen as a top-level concern for organizations of all types and sizes. Under today's security paradigm, determined attackers will eventually find their way into their target's network, often employing social engineering tactics, phishing techniques and backdoor exploits to steal credentials and obtain access. Persistent intrusions target key users within organizations to gain access to trade secrets, intellectual property, computer source code, and any other valuable information available. In order to combat APTs, it is imperative that organizations know what is going on within their internal networks to fill in the gaps left by perimeter security solutions. The APT detection system enables organizations to follow a defence-in-depth methodology. The APT detection system designed here combines modules such as IDS, IPS, UTM and SIEM, working together as a grid and correlating rules with each other for complete defence. The firewall provides gateway-level protection against attacks. The intrusion detection system detects anomalous behaviour and threat signatures. The intrusion prevention system detects and prevents vulnerability exploits in the network. In short, the advanced persistent threat detection system designed here is an incorporation of all security modules working together as a grid to provide a secure defence system, as it detects low-and-slow attacks that do not generate the usual alarms and responds quickly to attacks.
Tracking and modeling focus of attention in meetings
This thesis addresses the problem of tracking the focus of attention of people. In particular, a system to track the focus of attention of participants in meetings is developed. Obtaining knowledge about a person’s focus of attention is an important step towards a better understanding of what people do, how and with what or whom they interact or to what they refer. In meetings, focus of attention can be used to disambiguate the addressees of speech acts, to analyze interaction and for indexing of meeting transcripts. Tracking a user’s focus of attention also greatly contributes to the improvement of human-computer interfaces since it can be used to build interfaces and environments that become aware of what the user is paying attention to or with what or whom he is interacting. The direction in which people look; i.e., their gaze, is closely related to their focus of attention. In this thesis, we estimate a subject’s focus of attention based on his or her head orientation. While the direction in which someone looks is determined by head orientation and eye gaze, relevant literature suggests that head orientation alone is a sufficient cue for the detection of someone’s direction of attention during social interaction. We present experimental results from a user study and from several recorded meetings that support this hypothesis. We have developed a Bayesian approach to model at whom or what someone is looking based on his or her head orientation. To estimate head orientations in meetings, the participants’ faces are automatically tracked in the view of a panoramic camera and neural networks are used to estimate their head orientations from pre-processed images of their faces. Using this approach, the focus of attention target of subjects could be correctly identified during 73% of the time in a number of evaluation meetings with four participants. In addition, we have investigated whether a person’s focus of attention can be predicted from other cues. Our results show that focus of attention is correlated to who is speaking in a meeting and that it is possible to predict a person’s focus of attention based on the information of who is talking or was talking before a given moment. We have trained neural networks to predict at whom a person is looking, based on information about who was speaking. Using this approach we were able to predict who is looking at whom with 63% accuracy on the evaluation meetings using only information about who was speaking. We show that by using both head orientation and speaker information to estimate a person’s focus, the accuracy of focus detection can be improved compared to just using one of the modalities for focus estimation. To demonstrate the generality of our approach, we have built a prototype system to demonstrate focus-aware interaction with a household robot and other smart appliances in a room using the developed components for focus of attention tracking. In the demonstration environment, a subject could interact with a simulated household robot, a speech-enabled VCR or with other people in the room, and the recipient of the subject’s speech was disambiguated based on the user’s direction of attention.
Effective PPG sensor placement for reflected red and green light, and infrared wristband-type photoplethysmography
Using a wristband-type photoplethysmography (PPG) sensor, useful biomedical information such as heart rate and oxygen saturation can be acquired. Most commercially used wrist-type PPG sensors use reflected green light because of hemoglobin's greater absorptivity of green light compared to other wavelengths; this is important because the wrist has a comparably low concentration of blood flow. For reliable biomedical signal processing, we propose measurement sites on the wrist for reflected red, green, and infrared light PPG sensors. Amplitude, detection rate, and heart rate accuracy are compared to determine the signal quality at each measurement site. Traditionally, wrist-type PPG sensors are placed at measurement site 2, 3, or between 2 and 3 (between the distal Radius and the head of Ulna). Experiments show that all three reflected light PPG sensors generate good-quality PPG signals at measurement sites 4 and 11 (around the distal Radius of the left hand) in test subjects.
Dexmedetomidine as a novel countermeasure for cocaine-induced central sympathoexcitation in cocaine-addicted humans.
Cocaine-induced acute hypertension is mediated largely by increased central sympathetic nerve activity. We hypothesized that dexmedetomidine, a central sympatholytic, reverses cocaine-induced increases in sympathetic nerve activity, mean arterial pressure (MAP), and heart rate (HR) in cocaine-addicted subjects. First, we conducted a dose-finding study in 15 nontreatment-seeking cocaine-addicted subjects and 12 cocaine-naive healthy controls to find doses of intravenous dexmedetomidine that lower MAP and HR in the absence of acute-cocaine challenge. We then conducted a placebo-controlled treatment trial in 26 cocaine-addicted subjects to determine whether dexmedetomidine reverses MAP and HR increases after intranasal cocaine (3 mg/kg). Skin sympathetic nerve activity (measured in the second protocol) and skin vascular resistance (measured in both protocols) served as indices of cocaine-sensitive central sympathoexcitation. In doses up to 0.6 µg/kg IV, dexmedetomidine alone caused comparable dose-dependent decreases in blood pressure in cases and controls but a 1.0 µg/kg dose was required to lower HR. In cocaine-addicted subjects, low-dose dexmedetomidine (0.4 µg/kg; n=14) abolished cocaine-induced increases in skin sympathetic nerve activity (156 ± 26 versus -15 ± 22%, cocaine/placebo versus cocaine/dexmedetomidine; P<0.05), skin vascular resistance (+10 ± 2 versus -2 ± 3 U; P<0.05), and MAP (+6 ± 1 versus -5 ± 2 mm Hg; P<0.01) without affecting HR (+13 ± 2 versus +9 ± 2 bpm; P=ns). When dexmedetomidine was increased to 1 µg/kg (high dose; n=12) to reverse cocaine-induced increases in HR, MAP did not fall further and increased paradoxically in 4 of 12 subjects. Thus, in a low nonsedating dose, dexmedetomidine constitutes a putative new treatment for cocaine-induced acute hypertension but higher sedating doses can increase blood pressure unpredictably during acute-cocaine challenge and should be avoided.
Novel strategies in newborn screening for cystic fibrosis: a prospective controlled study.
CONTEXT Newborn screening for cystic fibrosis (CF) is included in many routine programmes but current strategies have considerable drawbacks, such as false-positive tests, equivocal diagnosis and detection of carriers. OBJECTIVE To assess the test performance of two newborn screening strategies for CF. DESIGN, SETTING AND PARTICIPANTS In 2008 and 2009, CF screening was added to the routine screening programme as a prospective study in part of The Netherlands. INTERVENTIONS Two strategies were performed in all newborns. In the first strategy, concentrations of immunoreactive trypsinogen (IRT) and pancreatitis-associated protein (PAP) were measured. In the second strategy, samples with IRT ≥60 μg/litre were analysed for 36 CFTR mutations, followed by sequencing when a single mutation was detected. Tests were positive only with two identified CFTR mutations. MAIN OUTCOME Sensitivity, specificity and positive predictive value (PPV) of both screening strategies. RESULTS 145,499 infants were screened. The IRT/PAP approach showed a sensitivity of 95.0%, a specificity of 99.897% and a PPV of 12.3%. Test properties for the IRT/DNA/sequencing strategy were respectively 100%, 100% and 64.9%. Combining both strategies (IRT/PAP/DNA/sequencing) led to a sensitivity of 95.0%, a specificity of 100% and a PPV of 87.5%. CONCLUSION All strategies performed well. Although there was no statistically significant difference in test performance, the IRT/DNA/sequencing strategy detected one infant who was missed by IRT/PAP (/DNA/sequencing). IRT/PAP may be the optimal choice if the use of DNA technology must be avoided. If identification of carriers and equivocal diagnosis is considered an important disadvantage, IRT/PAP/DNA/sequencing may be the best choice.
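For readers less familiar with these test metrics, the following small sketch shows how sensitivity, specificity, and PPV are computed from 2x2 screening counts; the counts are hypothetical, chosen only to loosely resemble the IRT/PAP figures, and are not the study's data.

# Worked sketch of the screening metrics compared in the study.
def screening_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)  # fraction of true CF cases detected
    specificity = tn / (tn + fp)  # fraction of unaffected infants testing negative
    ppv = tp / (tp + fp)          # fraction of positive tests that are true cases
    return sensitivity, specificity, ppv

# Hypothetical counts for a cohort of 145,499 screened newborns.
sens, spec, ppv = screening_metrics(tp=19, fp=135, fn=1, tn=145344)
print(f"sensitivity={sens:.1%} specificity={spec:.3%} PPV={ppv:.1%}")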
Varactor-Loaded Pattern Reconfigurable Array for Wide-Angle Scanning With Low Gain Fluctuation
In this communication, an improved phased array using pattern-reconfigurable antenna elements is proposed to realize wide-angle scanning with low gain fluctuation. The pattern-reconfigurable element is a microstrip Yagi antenna whose parasitic strips are loaded with varactors. The reconfigurability of the element's radiation pattern is enabled by tuning the capacitive reactance of the varactors. Five elements were arranged in an equally spaced linear array, and the effects of several key parameter variations on the radiation characteristics of the array are provided. The proposed array was fabricated and experimentally verified. The measured results show that the main beam of the array can scan from -70° to 70° in the H-plane with a gain fluctuation of less than 2 dB. Meanwhile, the 3-dB beamwidth of the array covers approximately -87° to 87°. The agreement between simulated and measured results validates the design, which exhibits good wide-angle scanning performance.
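As background for the scanning behavior reported above, this is a textbook numpy sketch of how a five-element, equally spaced linear array steers its main beam with a progressive phase shift; the half-wavelength spacing and isotropic elements are assumptions, and the sketch does not model the varactor-loaded reconfigurable elements themselves.

# Array factor of an N-element uniform linear array steered to a scan angle.
import numpy as np

def array_factor_db(theta_deg, scan_deg, n=5, d_over_lambda=0.5):
    theta = np.radians(theta_deg)
    scan = np.radians(scan_deg)
    k_d = 2 * np.pi * d_over_lambda
    # Progressive per-element phase shift steers the beam to the scan angle.
    phase = np.arange(n)[:, None] * k_d * (np.sin(theta) - np.sin(scan))
    af = np.abs(np.exp(1j * phase).sum(axis=0)) / n
    return 20 * np.log10(np.maximum(af, 1e-6))

angles = np.linspace(-90, 90, 361)
pattern = array_factor_db(angles, scan_deg=40.0)
print(angles[np.argmax(pattern)])  # main beam lands near 40 degrees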
Exploration of Visual Data
Novel View Synthesis for Large-scale Scene using Adversarial Loss
Novel view synthesis aims to synthesize new images from different viewpoints of given images. Most previous works focus on generating novel views of certain objects with a fixed background. However, for some applications, such as virtual reality or robotic manipulation, large changes in background may occur due to the egomotion of the camera. Generated images of a large-scale environment from novel views may be distorted if the structure of the environment is not considered. In this work, we propose a novel fully convolutional network that can exploit structural information explicitly by incorporating inverse depth features. The inverse depth features are obtained from CNNs trained with sparse labeled depth values. This framework can easily fuse multiple images from different viewpoints. To fill in the missing textures of the generated image, an adversarial loss is applied, which also improves overall image quality. Our method is evaluated on the KITTI dataset. The results show that our method can generate novel views of large-scale scenes without distortion. The effectiveness of our approach is demonstrated through qualitative and quantitative evaluation.
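As a hedged illustration of the adversarial term described above, the following PyTorch sketch scores real and generated views with a small discriminator and trains the generator side to fool it; the tiny network, tensor shapes, and random inputs are placeholders rather than the paper's architecture.

# Minimal adversarial-loss sketch: real views labeled 1, generated views 0.
import torch
import torch.nn as nn

disc = nn.Sequential(nn.Conv2d(3, 8, 4, stride=2, padding=1),
                     nn.LeakyReLU(0.2),
                     nn.Flatten(),
                     nn.Linear(8 * 32 * 32, 1))
bce = nn.BCEWithLogitsLoss()

real = torch.rand(4, 3, 64, 64)  # ground-truth target views (placeholder)
fake = torch.rand(4, 3, 64, 64)  # views from the synthesis network (placeholder)

# Discriminator loss: distinguish real from generated views.
d_loss = (bce(disc(real), torch.ones(4, 1))
          + bce(disc(fake.detach()), torch.zeros(4, 1)))

# Generator (adversarial) term: push generated views toward the "real" label.
g_adv_loss = bce(disc(fake), torch.ones(4, 1))
print(d_loss.item(), g_adv_loss.item())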
Relation of therapeutic alliance and perfectionism to outcome in brief outpatient treatment of depression.
Prior analyses of the National Institute of Mental Health Treatment of Depression Collaborative Research Program demonstrated that perfectionism was negatively related to outcome, whereas both the patient's perception of the quality of the therapeutic relationship and the patient contribution to the therapeutic alliance were positively related to outcome across treatment conditions (S. J. Blatt, D. C. Zuroff, D. M. Quinlan, & P. A. Pilkonis, 1996; J. L. Krupnick et al., 1996). New analyses examining the relations among perfectionism, perceived relationship quality, and the therapeutic alliance demonstrated that (a) the patient contribution to the alliance and the perceived quality of the therapeutic relationship were independent predictors of outcome, (b) perfectionistic patients showed smaller increases in the Patient Alliance factor over the course of treatment, and (c) the negative relation between perfectionism and outcome was explained (mediated) by perfectionistic patients' failure to develop stronger therapeutic alliances.
Drug-Drug Interactions between Sofosbuvir and Ombitasvir-Paritaprevir-Ritonavir with or without Dasabuvir.
The combination of ombitasvir (an NS5A inhibitor), paritaprevir (an NS3/4A inhibitor) coadministered with ritonavir (r), and dasabuvir (an NS5B nonnucleoside polymerase inhibitor), referred to as the 3D regimen, and the combination of ombitasvir-paritaprevir-r, referred to as the 2D regimen, have demonstrated high efficacy with and without ribavirin in hepatitis C virus (HCV)-infected subjects. These regimens have potential for coadministration with sofosbuvir (nucleoside NS5B inhibitor) in the treatment of HCV. This phase 1, drug-drug interaction, open-label, multiple-dose study enrolled 32 healthy subjects to receive the 3D or 2D regimen in combination with sofosbuvir. Doses of study drugs were as follows: ombitasvir-paritaprevir-r, 25/150/100 mg daily (QD); dasabuvir, 250 mg twice daily (BID); and sofosbuvir, 400 mg QD. Blood samples were collected on study days 7, 14, and 21 for evaluating drug interaction at steady state. The effect of the 3D and 2D regimens on the pharmacokinetics of sofosbuvir and its circulating metabolite GS-331007 and vice versa was assessed by a repeated-measures analysis. Exposures of the 3D and 2D regimens were similar (≤20% change) during coadministration with sofosbuvir and during administration alone. Sofosbuvir exposures were 61% to 112% higher with the 3D regimen and 64% to 93% higher with the 2D regimen than with sofosbuvir alone. GS-331007 total exposures were 27% and 32% higher with the 3D and 2D regimens, respectively, than with sofosbuvir alone. Increases in sofosbuvir and GS-331007 exposures likely resulted from breast cancer resistance protein (BCRP) and/or P glycoprotein (P-gp) transporter inhibition by paritaprevir and ritonavir. No subjects discontinued the study due to study drug-related adverse events. No dose adjustment is recommended for 3D, 2D, or sofosbuvir in clinical trials exploring the safety and efficacy of the combination. (This study has been registered at ClinicalTrials.gov under registration no. NCT02356562 and NCT02292719.).
Adult women groomed by child molesters' heteronormative dating scripts
Understanding the paradox of heteronormative power in which women are forced into subjectivity and simultaneously constructed with agency as they take up available discourses is critical for breaking women's silence and for responding to child safety concerns. This paper draws from multiple interviews with fourteen women who partnered with men they later knew were sexually abusing children. Transcripts were analyzed by applying feminist interpretations of Foucauldian discourse theory that searched for repetitions of discourse in the language that women used to describe the heteronormative dating scripts used by their partners. The women indicated, once child sexual abuse became known, that the same heteronormative discourses operated to shame, blame and silence them. These discursive pressures compelled the women to maintain facades that represented heteronormative relationship ideals, which served to increase the men's control over them and strengthen the men's ability to keep their sexual abuse of children secret.
Some statistical issues in the comparison of speech recognition algorithms
In the development of speech recognition algorithms, it is important to know whether any apparent difference in performance of algorithms is statistically significant, yet this issue is almost always overlooked. We present two simple tests for deciding whether the difference in error-rates between two algorithms tested on the same data set is statistically significant. The first (McNemar’s test) requires the errors made by an algorithm to be independent events and is most appropriate for isolated word algorithms. The second (a matched-pairs test) can be used even when errors are not independent events and is more appropriate for connected speech.
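A self-contained sketch of the first test follows: McNemar's test uses only the discordant trials, those where exactly one of the two recognizers is correct, and asks whether their split is consistent with a fair coin; the counts in the example are illustrative.

# Exact two-sided McNemar test from discordant counts.
from math import comb

def mcnemar_p(n01, n10):
    """Two-sided exact McNemar p-value.

    n01: trials algorithm A got right and B got wrong
    n10: trials algorithm B got right and A got wrong
    """
    n = n01 + n10
    k = min(n01, n10)
    # Under H0 the discordant outcomes are Binomial(n, 0.5).
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Example: A beats B on 30 utterances, B beats A on 14.
print(mcnemar_p(30, 14))  # ~0.02, significant at the 5% level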
Mapping the world's photos
We investigate how to organize a large collection of geotagged photos, working with a dataset of about 35 million images collected from Flickr. Our approach combines content analysis based on text tags and image data with structural analysis based on geospatial data. We use the spatial distribution of where people take photos to define a relational structure between the photos that are taken at popular places. We then study the interplay between this structure and the content, using classification methods for predicting such locations from visual, textual and temporal features of the photos. We find that visual and temporal features improve the ability to estimate the location of a photo, compared to using just textual features. We illustrate using these techniques to organize a large photo collection, while also revealing various interesting properties about popular cities and landmarks at a global scale.
When Self-Pleasuring Becomes Self-Destruction: Autoerotic Asphyxiation.
Autoerotic asphyxia (AEA) is presented in literature review form. Etiology, prevalence statistics, and a profile of AEA participants are provided. The author identifies autoerotic asphyxia as a form of subintentional suicide. Warning signs of AEA are presented. Possible sources of misinformation are given. Prevention and education recommendations for administrators, faculty, and parents are provided, along with a suggested reading list.
Global refinement of random forest
Random forest is well known as one of the best learning methods. In spite of its great success, it also has certain drawbacks: the heuristic learning rule does not effectively minimize the global training loss; the model size is usually too large for many real applications. To address the issues, we propose two techniques, global refinement and global pruning, to improve a pre-trained random forest. The proposed global refinement jointly relearns the leaf nodes of all trees under a global objective function so that the complementary information between multiple trees is well exploited. In this way, the fitting power of the forest is significantly enhanced. The global pruning is developed to reduce the model size as well as the over-fitting risk. The refined model has better performance and smaller storage cost, as verified in extensive experiments.
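A hedged sketch of the global-refinement idea follows: keep the learned tree structure, re-encode every sample by the leaves it falls into across all trees, and jointly refit all leaf values under a single global objective; scikit-learn, ridge regularization in place of the paper's exact objective, and the toy data are assumptions for illustration.

# Global refinement sketch: jointly relearn leaf values of a trained forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.rand(500, 5)
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.05 * rng.randn(500)

forest = RandomForestRegressor(n_estimators=20, max_depth=4, random_state=0)
forest.fit(X, y)

# Leaf membership of every sample in every tree -> sparse indicator features.
leaves = forest.apply(X)              # shape (n_samples, n_trees)
encoder = OneHotEncoder(handle_unknown="ignore")
indicators = encoder.fit_transform(leaves)

# Jointly relearn all leaf values under one regularized squared loss.
refined = Ridge(alpha=1.0).fit(indicators, y)
print(refined.score(indicators, y))   # global fit after refinement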
Theoretical Framework for Plastic Waste Management in Ghana through Extended Producer Responsibility: Case of Sachet Water Waste.
Currently, use and disposal of plastic by consumers through waste management activities in Ghana not only creates environmental problems, but also reinforces the notion of a wasteful society. The magnitude of this problem has led to increasing pressure from the public for efficient and practical measures to solve the waste problem. This paper analyses the impact of plastic use and disposal in Ghana. It emphasizes the need for commitment to proper management of the impacts of plastic waste and effective environmental management in the country. Sustainable Solid Waste Management (SSWM) is a critical problem for developing countries with regard to climate change and greenhouse gas emissions, and also the general wellbeing of the populace. Key themes of this paper are producer responsibility and management of products at end of life. The paper proposes two theoretical recovery models that can be used to address the issue of sachet waste in Ghana.
Hyperosmotic stress induces autophagy and apoptosis in recombinant Chinese hamster ovary cell culture.
During recombinant Chinese hamster ovary (rCHO) cell culture, various events, such as feeding with concentrated nutrient solutions or the addition of base to maintain an optimal pH, increase the osmolality of the medium. To determine the effect of hyperosmotic stress on two types of programmed cell death (PCD), apoptosis and autophagy, of rCHO cells, two rCHO cell lines, producing antibody and erythropoietin, were subjected to hyperosmotic stress resulting from NaCl addition (310-610 mOsm/kg). For both rCHO cell lines, hyperosmolality up to 610 mOsm/kg increased cleaved forms of PARP, caspase-3, caspase-7, and fragmentation of chromosomal DNA, confirming the previous observation that apoptosis was induced by hyperosmotic stress. Concurrently, hyperosmolality increased the level of accumulation of LC3-II, a widely used autophagic marker, which was determined by Western blot analysis and confocal microscopy. When glucose and glutamine concentrations were measured during the cultures, glucose and glutamine concentrations in the culture medium at various osmolalities (310-610 mOsm/kg) showed no significant differences. This result suggests that induction of PCD by hyperosmotic stress occurred independently of nutrient depletion. Taken together, autophagy as well as apoptosis was observed in rCHO cells subjected to hyperosmolality.
IMPROVED DROOP CONTROL FOR PARALLEL INVERTER SYSTEM WITH LOAD
DC-AC converters are electronic devices used to convert direct current (DC) to alternating current (AC). Three-phase inverters are widely used in power electronics applications; consequently, DC-AC converters require a high-performance controller. Therefore, a system of two parallel three-phase inverters with a load is presented. In order to achieve load sharing between the parallel inverters, a linearly transformed droop control is applied. This control strategy combines the frequency and voltage droop method with an inverter output voltage regulation scheme. In the external power control structure, the reference frequency and magnitude of the inverter output voltage are obtained according to the droop characteristics. The droop control is improved to obtain a more stable voltage and better load sharing between the two parallel inverters. The performance of the control strategy is verified in simulation using Matlab/Simulink.
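For context, the droop law that this control strategy builds on can be sketched in a few lines: each inverter lowers its frequency reference with measured active power and its voltage reference with reactive power, so parallel units share load without communication; the nominal values and droop gains below are assumed, not taken from the paper.

# Conventional P-f / Q-V droop law for one inverter in a parallel system.
def droop_references(p_watts, q_vars, f0=50.0, v0=230.0,
                     kp=1e-4, kq=1e-3):
    """Return (frequency_hz, voltage_v) references for one inverter."""
    freq_ref = f0 - kp * p_watts  # f = f0 - kp * P
    volt_ref = v0 - kq * q_vars   # V = V0 - kq * Q
    return freq_ref, volt_ref

# Two parallel inverters with equal gains converge to equal load shares:
print(droop_references(p_watts=2000.0, q_vars=500.0))  # (49.8 Hz, 229.5 V)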