Antiproton Yield Diagnostics for the Tevatron I Debuncher
During start-up of the CERN AA, many hours of machine experiments went into the study and optimization of antiproton yields. Those involved in the commissioning programme experienced the difficulty of tuning a new machine to accept a low-intensity, full-aperture beam. The antiproton yield could only be obtained by integrating a slow Schottky scan of the beam on the injection orbit, normalized with respect to the primary beam intensity measured by a charge transformer just in front of the production target. A precise yield measurement took about five minutes. At high yields this method permitted measurements to within a few percent. The slowness of the multi-parameter yield optimization, starting from low yields where the measurement errors were often as large as the gains to be made, cannot be overemphasized. In the Tevatron I Debuncher the antiproton yields should be substantially higher than at the AA and, given a Schottky pick-up of sufficient sensitivity, the situation looks more promising. At the AA we have resolved some of our difficulties by improving the charge transformer signal, speeding up the Schottky scan, and adding instrumentation to use the signals from pions, muons and electrons injected along with the antiprotons. Low yields, e.g. at reduced aperture, are now measured using beam scrapers in conjunction with counters calibrated against the Schottky pick-up at high intensities. The latter is itself calibrated by the circulating beam current transformer at even higher intensities, usually with protons in reverse-polarity mode.
Based on the AA experience we outline the techniques that could be used for the following measurements and procedures at the Debuncher: (1) antiproton yield (number of antiprotons circulating in the Debuncher per incident proton) versus the machine apertures Δx, Δy, and Δp, (2) yield versus phase-space coordinates downstream from the production target, (3) use of other secondary-particle fluxes, (4) optimization of full-aperture yield at the start of and during antiproton accumulation.
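As an illustration of the normalization described above, the yield measurement reduces to dividing a calibrated Schottky-scan integral by the proton count from the target charge transformer. The sketch below is purely illustrative: the function name, the calibration constant, and the numbers are assumptions, not values from the report.

```python
def antiproton_yield(schottky_power, schottky_calibration, protons_on_target):
    """Yield = circulating antiprotons per incident proton (illustrative).

    The integrated Schottky power is proportional to the number of
    circulating particles; the calibration constant would come from
    cross-calibration against the circulating beam current transformer
    at high intensity (e.g. with reverse-polarity protons).
    """
    n_pbar = schottky_calibration * sum(schottky_power)
    return n_pbar / protons_on_target

# Made-up example: a scan of 100 frequency bins, arbitrary units
scan = [1.0e-3] * 100
y = antiproton_yield(scan, schottky_calibration=2.0e7, protons_on_target=1.0e13)
# y is then 2e6 circulating antiprotons / 1e13 protons = 2e-7 per proton
```

The same structure applies at low yields, with the Schottky integral replaced by a scraper/counter rate times its cross-calibration factor.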
- Single Report
- 10.2172/948912
- Sep 13, 1985
The present visit to CERN was the result of an invitation from Dr. Colin Johnson of the Antiproton Accumulator (AA) group. Two activities were planned for this visit: first, the second beam test of one of the original Fermilab lithium lenses (serial No. 2); second, the installation and beam tests of a new Fermilab lens of improved design (serial No. 5). It should be mentioned here that CERN, after realizing the possible gains to be obtained, has started a considerable development effort in short-focal-length lenses. Presently they have 3 operational lithium lenses, with transformers and power supplies for tests. They are in the process of constructing 3 other transformers and designing lenses of 4 cm diameter (twice the diameter of the present Fermilab lenses). Fermilab should devote some added effort in this field to maintain the initiative. The first beam test of lens No. 2 was performed during the summer of 1983, when the lens was used as an antiproton collecting lens. For this test the original lens was used as a strong focusing element in the 26 GeV proton beam in conjunction with a current-carrying target. Preliminary tests for this geometry were conducted during 1984, when the lens was exposed to over 2×10⁶ pulses at 320 kA and 1.3×10¹³ protons per pulse. Lens No. 5 was installed as an antiproton collecting lens, immediately following the AA production target, in a geometry similar to the one designed for the Tevatron I project at Fermilab. Targets of a different design than the one normally used at CERN were also required. After completion of the antiproton yield measurements and optimization, the lens was left in the beam during regular operation for antiproton accumulation. During antiproton accumulation for the LEAR accelerator, new records were achieved in the accumulation yield and accumulation rate of antiprotons for the AA machine.
CERN activities in the development of a plasma lens, as an alternative to the lithium ones, and of new broad-band pick-ups for stochastic cooling were also observed, as they are of great interest to Fermilab.
- Research Article
7
- 10.1109/tns.1983.4332951
- Aug 1, 1983
- IEEE Transactions on Nuclear Science
Antiprotons are produced for the CERN Antiproton Accumulator (AA) by focusing 26 GeV/c protons onto a 3 mm diameter, 11 cm long copper wire. Negatively charged particles with momenta of about 3.5 GeV/c are focused by a short-focal-length coaxial horn and transported to the AA by a normal quadrupole focusing channel. The yield of antiprotons was found to be considerably less than anticipated (by a factor of about 2), and the reason is presumed to be the assumption of too large a production cross-section in the original machine design proposal. Studies involving a new horn design, the introduction of an axial current (≈150 kA) along the target, and the use of lithium lenses as an alternative to the magnetic horn are under way. Some preliminary measurements involving some of these techniques have been made, both to confirm the validity of calculations and to test the feasibility of building targets and focusing systems to withstand the mechanical forces and heat load due to the proton beam and the high pulsed currents.
- Conference Article
1
- 10.1109/pac.1989.72988
- Mar 20, 1989
The developmental work on plasma lens prototypes for antiproton collection is summarized. The antiproton yield with a plasma lens is estimated. Results of the latest z-pinch model describing the plasma dynamics in such a lens are presented. The scaling of the final plasma lens parameters is based on both model and measurement. Destruction rates of the insulator tube and electrodes have been measured. A final set of parameters is proposed. Scaling from the experimental and numerical results shows that for the final Antiproton Collector plasma lens a new pulse generator is required, featuring a cycle time of more than 30 μs (twice that of the present test generator) and a stored pulse energy of more than 25 kJ at 10–13 kV charging voltage.
- Research Article
- 10.55124/jahr.v1i1.40
- Jun 25, 2021
- Journal of Advanced Agriculture & Horticulture Research
Growth and Yield Performance of Selected Wheat Genotypes at Variable Irrigation Management
- Conference Article
5
- 10.22323/1.395.0261
- Jul 6, 2021
The Latin American Giant Observatory (LAGO) is a distributed cosmic ray observatory operating at a regional scale across Latin America, deploying a large network of Water Cherenkov Detectors (WCD) and other astroparticle detectors over a wide range of latitudes, from Antarctica to México, and altitudes, from sea level to more than 5500 m a.s.l. Detector telemetry, atmospheric conditions and the flux of secondary particles at the ground are measured in extreme detail at each LAGO site using our own-designed hardware and firmware (ACQUA). To combine and analyse all these data, LAGO developed ANNA, our data analysis framework. Additionally, ARTI, a complete simulation framework, is designed to simulate the signals expected at our detectors from primary cosmic rays entering the Earth's atmosphere, allowing a precise characterization of the sites under realistic atmospheric, geomagnetic and detector conditions. As the measured and synthetic data started to flow, we are facing challenging scenarios given the large amount of data emerging from a diversity of detectors, computing architectures and e-infrastructures. These data need to be transferred, analyzed, catalogued, preserved, and provided for internal and public access and data-mining under an open e-science environment. In this work, we present the implementation of ARTI on the EOSC-Synergy cloud-based services as the first example of LAGO's frameworks following the FAIR principles for provenance, data curation and re-use of data. For this, we calculate the flux of secondary particles expected over up to 1 week at detector level for all 26 LAGO sites, and the 1-year flux of high-energy secondaries expected at the ANDES Underground Laboratory and other sites. Therefore, we show how this development can help not only LAGO but also other data-intensive cosmic ray observatories, muography experiments and underground laboratories.
- Research Article
4
- 10.3390/atmos15091039
- Aug 28, 2024
- Atmosphere
The Latin American Giant Observatory (LAGO) is a ground-based extended cosmic rays observatory designed to study transient astrophysical events, the role of the atmosphere on the formation of secondary particles, and space-weather-related phenomena. With the use of a network of Water Cherenkov Detectors (WCDs), LAGO measures the secondary particle flux, a consequence of the interaction of astroparticles impinging on the atmosphere of Earth. This flux can be grouped into three distinct basic constituents: electromagnetic, muonic, and hadronic components. When a particle enters a WCD, it generates a measurable signal characterized by unique features correlating to the particle’s type and the detector’s specific response. The resulting charge histograms from these signals provide valuable insights into the flux of primary astroparticles and their key characteristics. However, these data are insufficient to effectively distinguish between the contributions of different secondary particles. In this work, we extend our previous research by using detailed simulations of the expected atmospheric response to the primary flux and the corresponding response of our WCDs to atmospheric radiation. This dataset, which was created through the combination of the outputs of the ARTI and Meiga simulation frameworks, simulated the expected WCD signals produced by the flux of secondary particles during one day at the LAGO site in Bariloche, Argentina, situated at 865 m above sea level. This was achieved by analyzing the real-time magnetospheric and local atmospheric conditions for February and March of 2012, where the resultant atmospheric secondary-particle flux was integrated into a specific Meiga application featuring a comprehensive Geant4 model of the WCD at this LAGO location. The final output was modified for effective integration into our machine-learning pipeline. 
With an implementation of Ordering Points to Identify the Clustering Structure (OPTICS), a density-based clustering algorithm used to identify patterns in data collected by a single WCD, we have further refined our approach to implement a method that categorizes particle groups using advanced unsupervised machine learning techniques. This allowed for the differentiation among particle types and utilized the detector’s nuanced response to each, thus pinpointing the principal contributors within each group. Our analysis has demonstrated that applying our enhanced methodology can accurately identify the originating particles with a high degree of confidence on a single-pulse basis, highlighting its precision and reliability. These promising results suggest the feasibility of future implementations of machine-learning-based models throughout LAGO’s distributed detection network and other astroparticle observatories for semi-automated, onboard and real-time data analysis.
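The OPTICS procedure mentioned above orders points by density reachability, so that large jumps in the reachability profile mark boundaries between particle groups. The following is a minimal pure-Python sketch of that ordering pass applied to 1-D pulse-charge values; it is a textbook simplification, not the LAGO pipeline, and all names and numbers are illustrative.

```python
import heapq

def optics_order(points, min_pts=3):
    """Minimal OPTICS ordering pass for 1-D values (e.g. pulse charges).

    Returns the processing order and each point's reachability distance;
    large reachability jumps mark boundaries between density clusters.
    """
    n = len(points)

    def dist(i, j):
        return abs(points[i] - points[j])

    def core_distance(i):
        # Distance to the min_pts-th nearest neighbour of point i.
        ds = sorted(dist(i, j) for j in range(n) if j != i)
        return ds[min_pts - 1] if len(ds) >= min_pts else None

    processed = [False] * n
    reach = [float("inf")] * n
    order = []

    def update(i, seeds):
        # Lower the reachability of unprocessed neighbours of i.
        cd = core_distance(i)
        if cd is None:
            return
        for j in range(n):
            if not processed[j]:
                new_r = max(cd, dist(i, j))
                if new_r < reach[j]:
                    reach[j] = new_r
                    heapq.heappush(seeds, (new_r, j))

    for start in range(n):
        if processed[start]:
            continue
        processed[start] = True
        order.append(start)
        seeds = []
        update(start, seeds)
        while seeds:
            _, j = heapq.heappop(seeds)
            if processed[j]:
                continue  # stale heap entry
            processed[j] = True
            order.append(j)
            update(j, seeds)
    return order, reach

# Two well-separated groups of made-up pulse charges (arbitrary units):
charges = [1.0, 1.1, 1.2, 1.05, 5.0, 5.1, 5.2, 5.05]
order, reach = optics_order(charges, min_pts=2)
# Only two large reachability values appear: the first point processed
# (undefined reachability) and the jump into the second cluster.
```

In a real single-WCD analysis the input would be a multi-dimensional pulse-feature vector rather than a scalar charge, but the reachability ordering works identically.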
- Preprint Article
- 10.5194/egusphere-egu24-10518
- Nov 27, 2024
Understanding the zenith angle dependence of the Martian surface radiation environment is crucial for planning future human exploration missions to Mars. In our previous research (Wimmer et al. 2015; Guo et al. 2021; Khaksarighiri et al. 2023) we extensively studied the zenith-angle dependence of the Martian surface radiation dose rate. Leveraging the same validated radiation model, calibrated with data from the Radiation Assessment Detector (RAD) on Mars, we calculated the flux of secondary downward particles reaching the surface of Mars from various zenith angles, resulting from the interaction of primary particles with the Martian atmosphere. These fluxes of secondary particles, coming from different zenith angles, can be integrated into a comprehensive topographic map of Mars, providing a detailed depiction of the global radiation landscape. The construction of this radiation map requires careful consideration of various factors, including atmospheric column density, local and large-scale topography offering potential shielding effects, and the heliospheric modulation affecting the input spectrum. Additionally, accounting for seasonal pressure cycles and daily atmospheric surface-pressure variations due to thermal tides is essential. Our model specifically focused on the influence of zenith angle on atmospheric column depth, with simulations tailored to the Gale Crater region explored by the Curiosity rover. Applying this methodology allows us to create lookup tables of all secondary particles reaching the Martian surface from various zenith angles and to evaluate the atmospheric impact. Employing these matrices alongside the incident spectrum enables the calculation of the secondary particle flux from all zenith angles on the Martian surface. This method provides valuable insights into the fluctuations in radiation flux on Mars, facilitating thorough assessments of potential radiation hazards.
Mission planners can leverage these data, obtaining vital information to identify secure landing areas and sheltered regions for astronauts on the Martian surface.
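The final step described above, combining zenith-angle-resolved lookup tables with an incident spectrum, amounts to integrating a direction-dependent downward flux over the upper hemisphere. The sketch below shows that integral for a single scalar flux; the real lookup tables are particle- and energy-resolved, and every name here is an illustrative assumption.

```python
import math

def surface_flux(flux_lookup, n_bins=18):
    """Integrate a zenith-angle-dependent downward flux over the upper
    hemisphere onto a horizontal surface (illustrative):

        F_total = ∫∫ F(θ) cosθ sinθ dθ dφ,  θ in [0, π/2]

    flux_lookup(θ) returns the flux per unit solid angle at zenith
    angle θ (radians); azimuthal symmetry gives the factor 2π.
    """
    total = 0.0
    dtheta = (math.pi / 2) / n_bins
    for k in range(n_bins):
        theta = (k + 0.5) * dtheta  # bin-centre zenith angle (midpoint rule)
        # cosθ projects the directional flux onto the horizontal surface
        total += 2 * math.pi * flux_lookup(theta) \
                 * math.cos(theta) * math.sin(theta) * dtheta
    return total

# Sanity check: an isotropic unit flux integrates to π
iso = surface_flux(lambda theta: 1.0)
```

A topography-aware version would simply restrict the θ (and φ) range per map pixel to the unshielded part of the sky, which is where the shielding effect of crater walls enters.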
- Research Article
32
- 10.1016/j.algal.2018.11.002
- Nov 27, 2018
- Algal Research
Assessment of algal biofuel resource potential in the United States with consideration of regional water stress
- Conference Article
1
- 10.1063/1.36041
- Jan 1, 1986
Strong initiatives are being pursued in a number of countries for the construction of ‘‘kaon factory’’ synchrotrons capable of producing 100 times more intense proton beams than those available now from machines such as the Brookhaven AGS and CERN PS. Such machines would yield equivalent increases in the fluxes of secondary particles (kaons, pions, muons, antiprotons, hyperons and neutrinos of all varieties)—or cleaner beams for a smaller increase in flux—opening new avenues to various fundamental questions in both particle and nuclear physics. Major areas of investigation would be rare decay modes, CP violation, meson and hadron spectroscopy, antinucleon interactions, neutrino scattering and oscillations, and hypernuclear properties. Experience with the pion factories has already shown how high beam intensities make it possible to explore the ‘‘precision frontier’’ with results complementary to those achievable at the ‘‘energy frontier’’. This paper will describe proposals for upgrading the AGS and for building kaon factories in Canada, Europe, Japan and the United States, emphasizing the novel aspects of accelerator design required to achieve the desired performance (typically 100 μA at 30 GeV).
- Research Article
31
- 10.1016/0012-821x(80)90023-0
- May 1, 1980
- Earth and Planetary Science Letters
On the depth-dependent production of long-lived spallogenic ⁵³Mn in the St. Severin chondrite
- Research Article
13
- 10.1016/s0168-9002(00)01182-7
- Apr 1, 2001
- Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment
Development of a detector for bunch by bunch measurement and optimization of luminosity in the LHC
- Research Article
4
- 10.1016/s0273-1177(98)00070-2
- Jan 1, 1998
- Advances in Space Research
Distribution of energetic particles and secondary radiation according to orbital station “MIR” data obtained in 1991
- Research Article
- 10.1016/j.nima.2020.164299
- Jun 25, 2020
- Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment
Model uncertainty in accelerator application simulations
- Research Article
22
- 10.1016/j.asr.2009.11.009
- Nov 15, 2009
- Advances in Space Research
Solar proton events recorded in the stratosphere during cosmic ray balloon observations in 1957–2008
- Conference Article
- 10.22323/1.395.0487
- Aug 1, 2021
Atmospheric conditions affect the development of the cascades of secondary particles produced by primary cosmic rays. The Global Data Assimilation System, implementing atmospheric models based on meteorological measurements and numerical weather predictions, could significantly improve the outcomes of extensive air shower simulations. In this work, we present a methodology to simulate the effect of these atmospheric models on the secondary particle flux at the Earth's surface. The method was implemented for Bucaramanga, Colombia, using ARTI, a complete computational framework developed by the Latin American Giant Observatory Collaboration to estimate the particle spectra in Water Cherenkov Detectors as a function of geographical coordinates. As preliminary results, we observe differences in the total flux that vary from month to month with respect to the subtropical summer atmospheric profile.