Hierarchical Scheduling of an SDF/L Graph onto Multiple Processors

Abstract

Although dataflow models are known to excel at exploiting the task-level parallelism of an application, they make it difficult to exploit data parallelism, which is naturally expressed with loop structures, since such structures are not explicitly specified in existing dataflow models. The SDF/L model overcomes this shortcoming by specifying loop structures explicitly in a hierarchical fashion. We introduce a technique for scheduling an application represented by the SDF/L model onto heterogeneous processors. In the proposed method, we explore the mapping of tasks using an evolutionary meta-heuristic and schedule hierarchically in a bottom-up fashion, creating parallel loop schedules at the lower levels first and then reusing them when constructing the schedule at a higher level. The efficiency of the proposed scheduling methodology is verified with benchmark examples and randomly generated SDF/L graphs.
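The bottom-up idea in the abstract can be illustrated with a small sketch. All names and numbers below are hypothetical, and the evolutionary meta-heuristic used for task mapping in the paper is omitted: an inner parallel loop is scheduled first, and its makespan is then reused as the cost of a single composite task at the outer level.

```python
import math

def parallel_loop_makespan(iteration_time, iterations, num_procs):
    """Lower level: iterations of a data-parallel loop spread over processors."""
    return math.ceil(iterations / num_procs) * iteration_time

def chain_makespan(tasks):
    """Higher level: a simple task chain, with the loop folded into one task."""
    return sum(time for _, time in tasks)

# Schedule the inner loop body first (hypothetical numbers).
loop_time = parallel_loop_makespan(iteration_time=3, iterations=8, num_procs=4)

# Reuse that schedule's makespan when scheduling the outer graph.
outer = [("src", 2), ("loop", loop_time), ("sink", 1)]
makespan = chain_makespan(outer)
print(makespan)  # → 9
```

The point of the hierarchy is that the outer-level scheduler never re-examines the loop body; it only sees the composite task's precomputed cost.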

Similar Papers
  • Supplementary Content
  • Cited by 1
  • 10.17037/pubs.01856014
The association between the magnitude of T-cell interferon-gamma responses to Mycobacterium tuberculosis specific antigens and risk of progression to tuberculosis in household contacts tested with QuantiFERON-TB Gold In-Tube Assay.
  • Jul 22, 2014
  • LSHTM Research Online (London School of Hygiene and Tropical Medicine)
  • Kwame Shanaube


  • Conference Article
  • Cited by 7
  • 10.1145/67386.67433
The heart of object-oriented concurrent programming
  • Jan 1, 1988
  • J Lim + 1 more

Concurrency has been with us almost from the beginning of computing. Managing and programming for concurrency is a difficult problem and various solutions have been suggested over the years. Debates on message passing vs. remote procedure call, synchronous vs. asynchronous message passing, bounded vs. unbounded buffers, active vs. passive objects etc. still continue. No solution is entirely satisfactory. Concurrent programming usually depends heavily on the nature of the problem at hand and the architecture of the target machine.

  • Conference Article
  • Cited by 1
  • 10.1109/dasip.2014.7115616
Self-adaptive harris corner detector on heterogeneous many-core processor
  • Oct 1, 2014
  • Johny Paul + 7 more

Recent years have seen the emergence of heterogeneous system architectures (HSA), which offer massive computational power in a compact design. Computer vision applications with massive inherent parallelism benefit greatly from such heterogeneous processors with on-chip CPU and GPU units. The highly parallel, compute-intensive parts of an application can be mapped to the GPU, while control flow and high-level tasks run on the CPU. However, the hybrid architecture of these processors poses a considerable challenge to software development. Sharing of resources (GPU or CPU) among concurrently running applications leads to variations in processing interval, and prolonged processing intervals lead to low-quality results (frame drops) for computer vision algorithms. In this work, we propose resource awareness and self-organisation within the application layer to adapt to the resources available on the heterogeneous processor. The benefits of the new model are demonstrated using a widely used computer vision algorithm, the Harris corner detector. A resource-aware runtime system and a heterogeneous processor were used for evaluation, and the results indicate a well-constrained processing interval and reduced frame drops. Our evaluations demonstrate up to 20% improvement in processing rate and in the accuracy of the detected corner points for Harris corner detection.

  • Abstract
  • 10.1093/annonc/mdu327.37
291P - The Effect of the Level Expression of a Number of Molecular Parameters on the Achievement of Pathologic Complete Response in Patients with Triple-Negative Breast Cancer, Who Received Chemotherapy with Capecitabine
  • Sep 1, 2014
  • Annals of Oncology
  • O.D Bragina + 2 more


  • Research Article
  • Cited by 7
  • 10.1177/0098628317727645
Effects of Higher and Lower Level Writing-to-Learn Assignments on Higher and Lower Level Examination Questions
  • Aug 23, 2017
  • Teaching of Psychology
  • Jeffrey S Nevid + 2 more

Our study examined whether brief writing-to-learn assignments linked to lower and higher levels in Bloom's taxonomy differentially affected examination performance in assessing these skill levels. Using a quasi-random design, 91 undergraduate students in an introductory psychology class completed eight lower-level and eight higher-level writing assignments. We based both higher- and lower-level writing assignments on the same concepts drawn from chapters of the accompanying textbook, but the assignments differed in level of cognitive complexity. The results favored a top-down approach by showing that higher-level writing assignments produced significantly better performance on both lower- and higher-level exam questions derived from concepts students had written about.

  • Research Article
  • Cited by 3
  • 10.1145/67387.67433
The heart of object-oriented concurrent programming
  • Sep 26, 1988
  • ACM SIGPLAN Notices
  • J Lim + 1 more


  • Preprint Article
  • 10.5194/egusphere-egu21-12878
Influence of low-frequency variability on high and low groundwater levels: example of aquifers in northern France
  • Mar 4, 2021
  • Lisa Baulon + 4 more

Groundwater fluctuations very often exhibit well-pronounced low-frequency variability (multi-annual to decadal timescales), linked to the ability of catchments and aquifers to smooth out rapid fluctuations from precipitation (low-pass filtering), especially when their characteristic time is long. This low-frequency variability, generated by large-scale climate variability and modulated by the physical properties of hydrosystems, is clearly imprinted in the aquifers of northern France. Much recent research has addressed the capability of global climate models to reproduce low-frequency (most often multidecadal) variability. For hydrological processes such as groundwater levels, whose variance can be dominated by such low-frequency ranges, it is crucial to assess how sensitive very high or very low levels are to this variability. In this study, we investigate how low-frequency variability (from multi-annual to interdecadal timescales) may generate very high or very low groundwater levels (higher than the 80th percentile or lower than the 20th percentile, respectively). To test these hypotheses, our approach breaks groundwater level signals down into timescale components using the maximum overlap discrete wavelet transform, yielding wavelet details at different timescales. Multi-annual (~7 yr) and interdecadal (~17 yr) components appeared to be the dominant components of the low-frequency variability of the signals. We then subtracted these components (either one or both) and simply counted how many values remained above or below the selected threshold.

Results highlight that the number of events generated by low-frequency components is consistently and closely linked to their contribution to groundwater level variability. Nearly 100% of high and low groundwater levels in inertial aquifers, which exhibit a large predominance of interdecadal variability, are generated at this timescale. At least 50% of high and low groundwater levels in inertial aquifers displaying a combination of interdecadal and multi-annual variability are generated by the combination of these two timescales. Finally, less than 50% of high and low groundwater levels in mixed aquifers (i.e. those with a well-pronounced low-frequency variability superimposed on annual variability) are generated by the multi-annual and interdecadal variabilities. In all of the studied aquifers, across their various dynamics, we notice a higher sensitivity of low groundwater levels than of high groundwater levels to low-frequency variability.

Across the aquifers of northern metropolitan France, particularly in the chalk of the Paris Basin, we observe a fairly clear dependence of well-known historical high and low groundwater levels on low-frequency variability. In particular, the 2001 high levels and the 1992 low levels appear to be generated by concomitant multi-annual and interdecadal high levels and concomitant multi-annual and interdecadal low levels, respectively. On the other hand, the 1995 high levels and the 1998 low levels were produced by a multi-annual high level attenuated by an interdecadal low level, and by a multi-annual low level attenuated by an interdecadal high level, respectively. These phasings are also observed in precipitation and effective precipitation some time in advance (ranging from 2 months to 1.5 years). Finally, the contribution of multi-annual and interdecadal variabilities to making groundwater levels reach or exceed a selected threshold is directly influenced by their prominence in groundwater level variability.
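The counting step described in this abstract can be illustrated with a simplified stand-in: a moving average instead of the paper's maximum overlap discrete wavelet transform, applied to a made-up synthetic series (the series, window length, and seed are all assumptions). Remove a low-frequency component and compare how many values still exceed the 80th-percentile threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(600)
# Synthetic "monthly" series: a ~84-step (multi-annual) cycle plus noise.
series = np.sin(2 * np.pi * t / 84) + 0.3 * rng.standard_normal(t.size)

def moving_average(x, window):
    """Crude low-pass filter, standing in for a wavelet detail component."""
    return np.convolve(x, np.ones(window) / window, mode="same")

low_freq = moving_average(series, 25)   # tracks the slow cycle, smooths noise
residual = series - low_freq            # series with the slow component removed

high = np.percentile(series, 80)        # "very high level" threshold
n_before = int((series > high).sum())   # high levels in the raw series
n_after = int((residual > high).sum())  # high levels left after removal
```

With the slow component removed, far fewer values exceed the threshold, which is the paper's measure of how many high-level events the low-frequency variability generates.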

  • Research Article
  • Cited by 25
  • 10.1016/s0022-3476(36)80028-6
A study of the comparative value of cod liver oil, viosterol, and vitamin D milks in the prevention of rickets and of certain basic factors influencing their efficacy
  • Sep 1, 1936
  • The Journal of Pediatrics
  • Martha M Eliot + 4 more


  • Research Article
  • Cited by 47
  • 10.2214/ajr.178.2.1780497
Calcium scoring of the coronary artery by electron beam CT: how to apply an individual attenuation threshold.
  • Feb 1, 2002
  • American Journal of Roentgenology
  • Paolo Raggi + 2 more

Our aim was to assess the inter- and intraindividual variability of the attenuation threshold used to identify coronary artery calcification on electron beam CT and to illustrate a new threshold method. We measured the soft-tissue attenuation of regions surrounding the coronary arteries at the level of the left main coronary artery ostium (high level) and at the bottom of the heart (low level) in 48 consecutive patients (22 men, 26 women). Mean +/- 2 standard deviations (SD) of soft-tissue attenuation and variance of soft-tissue density and SDs were calculated at each level for every patient. It was assumed that setting an attenuation threshold greater than or equal to 3 SDs above that of soft tissue at each myocardial level would eliminate 99.5% of all scatter artifacts, allowing precise identification of calcific deposits. For the entire patient cohort, the average soft-tissue attenuation was 41 H and 35 H at the high and low levels, respectively (p < 0.01), indicating a large intraindividual variability. The SDs of soft-tissue attenuation measured by the computer software at the high and low levels were not different (26 H at the high level and 28 H at the low level; p = not significant). However, the calculated SD of the individual mean soft-tissue attenuation was 5 H at the high level and 8 H at the low level, again indicating a large intraindividual variability (p < 0.01). The addition of 3 measured SDs above the mean individual soft-tissue attenuation predicted a mean threshold of 120 and 121 H at the high and low levels, respectively, but with a wide interindividual variability (83-193 H at the high level and 79-242 H at the low level). There was a strong correlation between body weight and SD of soft-tissue attenuation at the low level (r = 0.75, p < 0.001) and a weaker but statistically significant correlation between weight and SD of soft-tissue attenuation at the high level (r = 0.51, p < 0.001). 
For the patients in this study, a threshold of 120 H for the detection of coronary calcification by electron beam CT seemed more appropriate than a threshold of 130 H, which is currently in use. However, given the great inter- and intraindividual variability, a biologic threshold tailored to the individual patient and to each individual imaging level should be used instead of a fixed threshold.
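The individualized threshold described in this abstract is simple arithmetic; the sketch below only illustrates the mean-plus-3-SD rule with made-up attenuation samples (in Hounsfield units), not the authors' software.

```python
import statistics

def individual_threshold(soft_tissue_hu):
    """Per-patient, per-level calcium threshold: mean + 3 SD of soft tissue."""
    mean = statistics.mean(soft_tissue_hu)
    sd = statistics.pstdev(soft_tissue_hu)
    return mean + 3 * sd

# Hypothetical soft-tissue ROI samples at one imaging level (HU).
samples = [40, 38, 45, 41, 36]
threshold = individual_threshold(samples)  # tailored cutoff, not a fixed 130 H
```

By construction, roughly 99.5% of soft-tissue scatter falls below this cutoff, so voxels above it can be treated as calcification.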

  • Research Article
  • Cited by 4
  • 10.1186/s41235-019-0200-5
Learning hierarchically organized science categories: simultaneous instruction at the high and subtype levels
  • Dec 1, 2019
  • Cognitive Research: Principles and Implications
  • Robert M Nosofsky + 2 more

Background: Most science categories are hierarchically organized, with various high-level divisions comprising numerous subtypes. If the goal is to teach students to classify at the high level, past research has provided mixed evidence about whether requiring simultaneous classification learning of the subtypes is an effective strategy. That research was limited, however, either because authentic science categories were not tested or because the procedures did not allow participants to form strong associations between subtype-level and high-level category names. Here we investigate a two-stage response-training procedure in which participants provide both a high-level and a subtype-level response on most trials, with feedback provided at both levels. The procedure is tested in experiments in which participants learn to classify large sets of rocks that are representative of those taught in geoscience classes. Results: The two-stage procedure yielded high-level classification performance that was as good as that of comparison groups trained solely at the high level. In addition, the two-stage group achieved far greater knowledge of the hierarchical structure of the categories than did the comparison controls. Conclusion: In settings in which students are tasked with learning high-level names for rock types commonly taught in geoscience classes, it is best for students to learn simultaneously at the high and subtype levels (using training techniques similar to the one investigated here). Beyond providing insights into the nature of category learning and representation, these findings have practical significance for improving science education.

  • Research Article
  • Cited by 11
  • 10.15017/6318
Multiple Clustered Core Processors
  • Mar 1, 2006
  • Kyushu University Institutional Repository (QIR) (Kyushu University)
  • Toshinori Sato + 3 more

This paper proposes multiple clustered core processors as a solution that attains both low power consumption and ease of programming. Given the current trend of increasing power consumption and temperature, many CPU vendors have shipped, or announced plans to ship, multiple core processors. In particular, recent studies on heterogeneous multiple core processors show that they use energy more efficiently than homogeneous ones. However, they require programmers to handle complex task scheduling, since the size of every task must always match the performance of the core to which it is allocated. Multiple clustered core processors relieve programmers of this tedious job. Simulation results show that a multiple clustered core processor consumes slightly more power than a heterogeneous multiple core processor. In one case, however, the heterogeneous multiple core processor cannot solve a severe task scheduling problem that the multiple clustered core processor can.

  • Conference Article
  • Cited by 12
  • 10.1109/pic.2010.5687483
Transformation from Data Flow Diagram to UML2.0 activity diagram
  • Dec 1, 2010
  • Fanchao Meng + 2 more

Model transformations are frequently applied in business process modeling to bridge between languages at different levels of abstraction and formality. This paper proposes a model transformation from Data Flow Diagrams (DFD), which have been used widely in the structured requirements analysis phase, to UML Activity Diagrams (UML-AD), which have been used widely in various phases of object-oriented development methods. First, we analyze the elements of DFD and the elements of UML-AD into which DFD elements can be transformed. Based on the correspondences between DFD and UML-AD, transformation rules from DFD to UML-AD are proposed. Finally, a textbook purchase and sale system is used as a case study to verify the feasibility and effectiveness of the approach. With the proposed method, business process models written as DFDs can be easily transformed into business process models written as UML-ADs, which enables interoperability between DFD and UML.

  • Research Article
  • Cited by 30
  • 10.1074/jbc.m107107200
The Diet1 Locus Confers Protection against Hypercholesterolemia through Enhanced Bile Acid Metabolism
  • Jan 1, 2002
  • Journal of Biological Chemistry
  • Jack Phan + 3 more

The C57BL/6ByJ (B6By) mouse strain is resistant to diet-induced hypercholesterolemia and atherosclerosis, despite its near genetic identity with the atherosclerosis-susceptible C57BL/6J (B6J) strain. We previously identified a genetic locus, Diet1, which is responsible for the resistant phenotype in B6By mice. To investigate the function of Diet1, we compared mRNA expression profiles in the liver of B6By and B6J mice fed an atherogenic diet using a DNA microarray. These studies revealed elevated expression levels in B6By liver for key bile acid synthesis proteins, including cholesterol 7alpha-hydroxylase and sterol-27-hydroxylase, and the oxysterol nuclear receptor liver X receptor alpha. Expression levels for several other genes involved in bile acid metabolism were subsequently found to differ between B6By and B6J mice, including the bile acid receptor farnesoid X receptor, oxysterol 7alpha-hydroxylase, sterol-12alpha-hydroxylase, and hepatic bile acid transporters on both sinusoidal and canalicular membranes. The overall expression profile of the B6By strain suggests a higher rate of bile acid synthesis and transport in these mice. Consistent with this interpretation, fecal bile acid excretion is increased 2-fold in B6By mice, and bile acid levels in blood and urine are elevated 3- and 18-fold, respectively. Genetic analysis of serum bile acid levels revealed co-segregation with Diet1, indicating that this locus is likely responsible for both increased bile acid excretion and resistance to hypercholesterolemia in B6By mice.

  • Research Article
  • 10.34190/eccws.21.1.317
A Collaborative Design Method for Safety and Security Engineers
  • Jun 8, 2022
  • European Conference on Cyber Warfare and Security
  • Taito Sasaki + 2 more

The number of cyberattacks has been increasing, not only on information systems but also on physical systems, so the safety impact of cyberattacks must be considered. Vulnerabilities exploited in cyberattacks continue to emerge day by day even when systems were developed securely, and security engineers must eliminate them even when they appear after the developed systems are released. Vulnerabilities must therefore be managed throughout the system life cycle; however, applying a security patch takes time, and safety engineers are required to ensure safety even while vulnerabilities exist. Collaboration between safety and security (S&S) engineers is thus necessary to manage S&S during the operation process, and S&S should be considered simultaneously in the early stages of the development process. Collaborative discussion is useful for mitigating the risk of rework. One example of rework caused by inadequate S&S discussion is a braking system that must be redesigned for faster response in order to compensate for the delay introduced by encryption. This paper therefore proposes common models that support such collaboration throughout the system life cycle, together with a management approach that uses them. The common model is represented by a data flow diagram (DFD), because a module under cyberattack can adversely affect other modules only through data flows. In the proposed method, three improvements contribute to supporting management throughout the system life cycle. First, the models are applied to both safety analysis and security analysis. Second, vulnerability occurrence is managed at the level of modules: system structures are designed based on modules, and module abnormalities caused by cyberattacks on the vulnerabilities are managed as causes of safety corruption. To indicate critical points the system must consider, the points are identified from a safety perspective, and processes and information are traced from those points in the DFD. Finally, a module, which performs a set of functions, may be outsourced; for each module, it must be decided who will manage its vulnerabilities. The proposed method is illustrated using the development of a self-driving wheelchair as an example. In this paper, the collaborative design method for S&S engineers of products, and their management based on modules, are described to ensure safety even when unexpected vulnerabilities exist.

  • Research Article
  • Cited by 69
  • 10.1002/wea.2469
The winter storms of 2013/2014 in the UK: hydrological responses and impacts
  • Feb 1, 2015
  • Weather
  • Katie Muchan + 3 more

This paper outlines the hydrological aspects of the 2013/2014 winter flooding in the UK, as well as the impacts. The episode is considered in a long-term historical context and wider issues raised by the flood events are discussed briefly.
