Abstract

The ALICE [1] Silicon Pixel Detector (SPD) [2] includes 120 detector modules, each containing 10 pixel chips. Each pixel chip can generate a FastOR signal indicating the presence of at least one pixel hit in the corresponding 8192-pixel matrix. The Pixel Trigger (PIT) system [3][4] has been implemented to process the 1200 FastOR signals from the SPD and to provide an input signal to the ALICE Central Trigger Processor (CTP) [5] for the fastest (Level 0) trigger decision within a latency of 800 ns. Since the PIT contributes to the trigger decision in ALICE, its data flow needs to be monitored carefully and status information needs to be made available. The PIT control system therefore required a careful design of hardware and software solutions to implement a coordinated operation of the PIT and the ALICE systems to which it interfaces. A driver layer was developed under stringent requirements of robustness and reusability, and it qualifies as a general-purpose hardware driver for electronic systems. It uses the ALICE Detector Data Link (DDL) [6] front-end board (SIU) to communicate with the PIT hardware. We present here the design and implementation of the Pixel Trigger Front End Device (FED) server [7].

I. The Pixel Trigger Control Hardware

The pixel trigger hardware is composed of a main processing board into which 10 mezzanine cards (Optin boards) are plugged. Each Optin board is capable of reading the output of 12 SPD detector modules (optical links). A PCI-inspired control bus with transaction acknowledgement and late parity check is used for communication between all on-board devices. A DDL was included as the communication medium for the control interface: commands are received and status information is read back via a bridge between this device and the internal control bus. A second FPGA on the processing board is dedicated to the slow control, to the system interfaces and to the reconfiguration of the main processing FPGA.
Status monitoring and control are implemented via registers in all the programmable devices available in the hardware. Remote programming of the processing FPGA is foreseen: the programming file for a given processing algorithm will be downloaded via the DDL link and the control FPGA to a local SRAM memory, and then transferred to the Flash PROM connected to the processing FPGA.

II. The PIT Control System

The Pixel Trigger control system was designed to operate and control the pixel trigger hardware. It takes appropriate corrective actions to maintain the triggering stability and to ensure the data quality. It is composed of two computers: a Linux PC acting as the driver layer of the system, using the ALICE Data Acquisition (DAQ) standard hardware equipped with a SIU/DDL module to interface with the PIT electronics, and a Windows PC acting as the supervision layer of the system, running CERN's standard Supervisory Control and Data Acquisition (SCADA) framework, PVSS II [8]. The PIT control system is part of the ALICE Detector Control System (DCS) [9]. The architecture of the ALICE online software is strictly hierarchical. The main systems, DCS, DAQ, CTP and High Level Trigger (HLT), work as autonomous applications. At the highest level of this hierarchy is the Experiment Control System (ECS) [10], which controls all online applications. The global ALICE DCS is partitioned into sub-detectors that are seen as independent control systems integrated in a hierarchical Finite State Machine (FSM) [11].

III. The PIT FED Server

The Pixel Trigger FED server is the software developed to act as the driver layer of the system. It uses a SIU/DDL to communicate with the PIT hardware.

Figure 1: The model of the ALICE Detector Control System. Partitioning is based on sub-detectors. The ECS is the highest-level instance with control of all online applications.
