Fusion Engineering and Design xxx (2016) xxx–xxx
Overview of data acquisition and central control system of steady state superconducting Tokamak (SST-1)
S. Pradhan ∗, K. Mahajan, H.K. Gulati, M. Sharma, A. Kumar, K. Patel, H. Masand, I. Mansuri, J. Dhongde, M. Bhandarkar, H. Chudasama
Institute for Plasma Research, Gandhinagar, Gujarat, India
∗ Corresponding author. E-mail address: [email protected] (S. Pradhan).
Highlights
• The paper gives an overview of the SST-1 data acquisition and central control system and of future upgrade plans.
• The lossless PXI-based data acquisition system of SST-1 acquires around 130 channels at sampling frequencies ranging from 10 kHz to 1 MHz.
• Design, architecture and technologies used for the central control system (CCS) of SST-1.
• Functions performed by the CCS.
Article info
Article history: Received 18 June 2015; Received in revised form 10 June 2016; Accepted 15 June 2016; Available online xxx.
Keywords: SST-1 central control; Data acquisition; Network; Real-time control; Plasma control
Abstract
The Steady State Superconducting Tokamak (SST-1) has been commissioned successfully and has been carrying out limiter-assisted ohmic plasma experiments since the beginning of 2014, achieving a maximum plasma current of 75 kA at a central field of 1.5 T with a plasma duration of ∼500 ms. In the near future, SST-1 looks forward to carrying out elongated plasma experiments and stretching plasma pulses beyond 1 s. The data acquisition system and the central control system (CCS) of SST-1 are distributed, modular, hierarchical and scalable in nature. The CCS has been indigenously designed, developed, implemented, tested and validated for the operation of SST-1. It has been built using well-proven technologies such as Red Hat Linux, the VxWorks RTOS for deterministic control, FPGA-based hardware, an Ethernet and fiber optic network backbone, DSPs for real-time computation and reflective memory for high-speed data transfer. The CCS controls and monitors the various heterogeneous SST-1 subsystems dispersed across the campus. It consists of the machine control system, the basic plasma control system, a GPS time synchronization system, a storage area network (SAN) for centralized data storage, the SST-1 networking system, real-time networks, the SST-1 control room infrastructure and many other supporting systems. The Machine Control System (MCS) is a multithreaded, event-driven system running on Linux-based servers, in which each thread of the software communicates with a unique subsystem for monitoring and control from the SST-1 central control room through network programming. The CCS hardware and software architecture fulfills the present operation and control requirements of SST-1 and has been successfully validated in several SST-1 operation campaigns since 2013. The lossless PXI-based data acquisition system of SST-1 acquires around 130 channels at sampling frequencies ranging from 10 kHz to 1 MHz and can acquire a data volume of 500 MB per shot. An indigenously developed Matlab-based software utility is used to analyze the data; the complete data acquisition system has likewise been validated in several SST-1 operation campaigns since 2013. This paper provides an overview of all the above-mentioned SST-1 DAQ and CCS subsystems, focusing on their design, architecture, performance, lessons learned and future upgrade plans.
© 2016 Elsevier B.V. All rights reserved.
1. Introduction
The SST-1 [1,2] Central Control System (CCS) is based on a distributed, hierarchical control architecture. It consists of Machine and Experiment Control at the top, followed by Discharge Control, the timing system, the access control system, the database system, etc.
Fig. 1. SST-1 Central Control System Configuration.
Fig. 2. SST-1 Subsystem interfacing.
Further, the SST-1 core subsystems, such as the magnet power supplies, the vacuum system, the cryogenic system and the ECRH, ICRH and LHCD systems, are geographically dispersed. All SST-1 subsystems and the CCS are connected by a high-speed 24-core fiber optic network, and all are synchronized by the VME-based timing system, which is connected to a GPS grandmaster clock. The timing system governs all plant operation state transitions, including normal and emergency shutdown. In parallel, hardwired interlocks between individual subsystems also monitor operational events of the plant and perform preventive and protective actions to maintain the machine and its subsystems in a safe state.
The CCS software is designed around an event-driven model and runs on Linux- and VxWorks-based servers and single-board computers. It features real-time monitoring and control of plasma position, shape and density. The fastest hard real-time control loop is 100 µs, used for plasma position control. The SST-1 Data Acquisition System (DAS), with its heterogeneous composition and distributed architecture, covers a wide range of slow to fast channels interfaced with a large set of diagnostics. The DAS also provides the essential user interface for data acquisition, catering to both on-line and off-line data usage. The central archiving and retrieval service is based on a two-step architecture combining Network Attached Storage (NAS) and a Storage Area Network (SAN). The SAN-based centralized data storage system of SST-1 is envisaged to make experimental data centrally available for both archival and retrieval. Considering the initial experimental data volume, a SAN-based data storage system of ∼10 TB capacity has been configured and installed on an optical fiber backbone compatible with the existing SST-1 Ethernet network. For the safety and surveillance of personnel and machine, the CCS is equipped with a fire alarm system, access control, a machine vision system and a voice communication system. The central control system is provided with isolated and stabilized power, a separate ground pit and an equipotential mesh, and no electrical cables enter or leave the CCS. The CCS software and hardware have been in successful operation up to the last (XII) SST-1 experiment campaign.

1.1. Control system requirements

The control system objectives of SST-1 can be summarized as follows:
• Proper sequencing of the various subsystems of the tokamak.
• Remote monitoring and control of system elements.
• Ease of use via graphical user interfaces.
• Detection of anomalous operational modes, alerting the users to such situations and, where possible, executing automatic procedures to ensure personnel and machine protection.
• Archiving of operational parameters for future restoration and analysis.
• Support for executing fast real-time tasks on the 100 µs to 1 ms scale.
• Support for real-time visualization of selected plasma parameters.
• Consulting and coordination of subsystem control development efforts, including the definition of tools and interface standards for the machine.
• Central control support for controlled access to the tokamak hall, the safety interlock system, the control room facility, machine vision, voice communication equipment and the networks.

In terms of connectivity and functionality, the system is classified into the groups shown in Fig. 1.

2. Machine and experiment control system

In the CCS, a state diagram has been devised to make the SST-1 subsystems ready for operation and subsequently to execute the experiments on SST-1 [6]. The MCS implements the CCS state diagram. It is a multithreaded, event-driven application in which each thread communicates with an individual subsystem via its specific interface. Any subsystem communication failure or local subsystem error does not prohibit the execution of the MCS or, in turn, the CCS operation. Provisions have been made to test individual subsystems, and if needed any subsystem can be bypassed dynamically. The MCS ensures the software interlock between the subsystems and the CCS. It continuously monitors the subsystems' status and their vital process parameters during an experiment campaign. It also provides the platform for the operator to visualize and exchange operational and experimental configuration parameters with the subsystems interfaced with the CCS. The MCS is a modular, flexible and scalable system. The MCS [6] remains operational 24 × 7 from the commencement to the termination of an SST-1 experiment campaign. A set of high-level commands and several protocols have been devised to establish robust communication between the MCS and the SST-1 subsystems. Communication is carried out either directly with a subsystem's controller or with the host PC of the subsystem. The MCS has performed robustly and flawlessly during all the past campaigns of SST-1 (Fig. 2).
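The thread-per-subsystem pattern described above can be pictured with a minimal C sketch using POSIX threads and TCP sockets: one monitor thread per subsystem, with an unreachable subsystem merely flagged as bypassed rather than blocking the others. The subsystem names, addresses, port and the "STATUS?" command are hypothetical placeholders; this is not the MCS source code.

```c
/* Minimal thread-per-subsystem monitor (illustrative only; names, addresses,
 * port and the "STATUS?" protocol are hypothetical placeholders). */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <pthread.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

typedef struct {
    const char *name;     /* subsystem identifier            */
    const char *ip;       /* host PC or controller address   */
    int         port;     /* TCP port of the subsystem agent */
    int         bypassed; /* set to 1 when unreachable       */
} subsystem_t;

static subsystem_t subsystems[] = {
    { "VACUUM", "10.0.0.11", 5001, 0 },
    { "CRYO",   "10.0.0.12", 5001, 0 },
    { "ECRH",   "10.0.0.13", 5001, 0 },
};

/* One thread per subsystem: connect, poll status, report to the operator. */
static void *monitor(void *arg)
{
    subsystem_t *s = (subsystem_t *)arg;
    struct sockaddr_in addr = { .sin_family = AF_INET,
                                .sin_port   = htons((uint16_t)s->port) };
    inet_pton(AF_INET, s->ip, &addr.sin_addr);

    for (;;) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0 || connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            s->bypassed = 1;   /* a failed subsystem does not stop the others */
            fprintf(stderr, "[%s] unreachable, bypassed\n", s->name);
        } else {
            char reply[128] = { 0 };
            send(fd, "STATUS?\n", 8, 0);            /* hypothetical command  */
            recv(fd, reply, sizeof reply - 1, 0);
            printf("[%s] %s\n", s->name, reply);
            s->bypassed = 0;
        }
        if (fd >= 0) close(fd);
        sleep(1);                                   /* periodic monitoring   */
    }
    return NULL;
}

int main(void)
{
    pthread_t tid[sizeof subsystems / sizeof subsystems[0]];
    for (size_t i = 0; i < sizeof subsystems / sizeof subsystems[0]; ++i)
        pthread_create(&tid[i], NULL, monitor, &subsystems[i]);
    for (size_t i = 0; i < sizeof subsystems / sizeof subsystems[0]; ++i)
        pthread_join(tid[i], NULL);
    return 0;
}
```

In this arrangement each connection lives in its own thread, so a hung or bypassed subsystem only affects its own monitoring loop, mirroring the behaviour described for the MCS.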
3. Timing system

The VME-based timing system [4] is a distributed system with fiber optic connectivity. The system has five types of modules.

3.1. Master control module

It generates and distributes a 20 MHz master clock. The whole operation sequence is defined in terms of time-stamped triggers stored in the RAM of the module. It distributes synchronous as well as asynchronous events and also maintains a time-stamped log of all events.

3.2. Asynchronous event encoder module

This module is installed at a subsystem node and communicates with the master node. It has 16 asynchronous event channels, brought out on LEMO connectors, to notify the master of asynchronous events occurring on the subsystem side.

3.3. Timer module

Each timer module has four output channels, and each channel has a 64K × 16-bit RAM. Each RAM location is mapped to an event, and events can be mapped according to user needs to produce signals for peripheral devices. Interrupts can be generated for defined events or masked.

3.4. Synchronous event encoder module

It receives software-generated events from the VME host CPU and broadcasts them to the master control node. It has a full-duplex fiber optic link to the transceiver module.

3.5. TX/RX module

It communicates with the slave nodes over optical fiber; each module serves two slave nodes (Fig. 3).
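To make the idea of an operation sequence stored as time-stamped triggers concrete, the following C sketch models a trigger table as it might be held in the master control module RAM. The structure layout, event codes and times are hypothetical simplifications, not the actual SST-1 register map.

```c
/* Hypothetical time-stamped trigger table for an operation sequence
 * (illustrative only; values are placeholders, not SST-1 settings). */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint64_t time_us;      /* time stamp relative to sequence start (µs) */
    uint16_t event_code;   /* event mapped in the timer module RAM        */
    uint16_t channel_mask; /* which timer output channels fire            */
} trigger_t;

static const trigger_t sequence[] = {
    {       0, 0x0001, 0x000F },  /* T0: start of shot, all channels      */
    {  100000, 0x0010, 0x0001 },  /* +100 ms: example gas-puff trigger    */
    {  200000, 0x0020, 0x0002 },  /* +200 ms: example breakdown trigger   */
    { 1200000, 0x00FF, 0x000F },  /* +1.2 s: end of shot, stop DAS        */
};

int main(void)
{
    /* Walk the table in time order and print what a timer module would do. */
    for (size_t i = 0; i < sizeof sequence / sizeof sequence[0]; ++i)
        printf("t=%8llu us  event=0x%04X  channels=0x%04X\n",
               (unsigned long long)sequence[i].time_us,
               sequence[i].event_code, sequence[i].channel_mask);
    return 0;
}
```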
Fig. 3. Timing System Architecture.
Fig. 4. Plasma Position Feedback loop.
4. Discharge control

The primary objective of the SST-1 plasma control system [10] is to achieve plasma position, shape and current profile control. The architecture of the control system is distributed in nature. The fastest control loop requirement of 100 µs is met using VME-based simultaneously sampling ADCs, a PMC-based quad-core DSP, a reflective memory (RFM) based real-time network, a VME-based real-time trigger distribution network and an Ethernet network. All the control loops for shape, position and current profile control share common signals from the electromagnetic (EM) diagnostics, so it is planned to accommodate all the algorithms on the same PMC-based quad-core DSP module (TS-C43). The RFM-based real-time data network replicates data from one node to the next in a ring topology at a sustained throughput of 13.4 Mbps. The real-time timing system network provides guaranteed trigger distribution within 3.8 µs from one node to all nodes of the network. Monitoring and configuration of the different systems participating in the operation of SST-1 are done over the Ethernet network (Fig. 4). Magnetic sensor data are acquired using a Pentek 6802 simultaneously sampling ADC card at a rate of 10 kSPS. All the real-time raw data, along with the control data, are archived using the RFM network and SCSI hard disks for an experiment duration of 1000 s. The RFM network is also planned to be used for real-time plotting of key plasma parameters during long experiments. After an experiment, these data are transferred to the central storage server for archival.
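The 100 µs position loop can be sketched as: sample the EM diagnostics, form the radial position error, and apply a digital PID correction to the position-control coil current reference. The C skeleton below illustrates this structure only; the I/O stubs, gains and the PID form are assumptions and do not represent the algorithm running on the SST-1 DSP.

```c
/* Generic 100 µs PID position-loop skeleton (placeholder I/O functions;
 * gains and signal names are illustrative, not SST-1 values). */
#include <stdio.h>

#define LOOP_DT 100e-6   /* control period: 100 µs */

/* Placeholder hardware access: in the real system these values would come
 * from the simultaneously sampling ADCs and go out via the RFM network.    */
static double read_radial_position_error(void) { return 0.0; /* metres */ }
static void   write_coil_reference(double amps) { (void)amps; }

typedef struct { double kp, ki, kd, integral, prev_err; } pid_ctrl_t;

static double pid_step(pid_ctrl_t *c, double err, double dt)
{
    c->integral += err * dt;
    double deriv = (err - c->prev_err) / dt;
    c->prev_err  = err;
    return c->kp * err + c->ki * c->integral + c->kd * deriv;
}

int main(void)
{
    pid_ctrl_t radial = { .kp = 1.0, .ki = 0.1, .kd = 0.01 };  /* example gains */

    for (long cycle = 0; cycle < 10; ++cycle) {  /* one iteration per 100 µs tick */
        double err = read_radial_position_error();     /* from EM diagnostics */
        double ref = pid_step(&radial, err, LOOP_DT);  /* control output      */
        write_coil_reference(ref);                     /* to the power supply */
        printf("cycle %ld: err=%g ref=%g\n", cycle, err, ref);
        /* In the real system the next iteration is released by the hardware
         * trigger from the timing system, not by a software delay.           */
    }
    return 0;
}
```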
5. Data acquisition system

The Data Acquisition System (DAS) [3] acquires around 130 channels from the different SST-1 diagnostics and subsystems [9]. PXI-based and CAMAC-based DAS have been chosen to cater to this need, with sampling rates varying from 10 kS/s to 1 MS/s. The associated DAS software essentially addresses the requirement of active initialization of the acquisition from SST-1 central control and, subsequently, pushing the sampled data into the NAS for archival over Ethernet. The focus is on reliable, lossless data acquisition for the various slow and fast SST-1 diagnostic channels, along with synchronization of the DAS elements with the central timing system. The system additionally includes a service subsystem responsible for retrieval and analysis of diagnostic data; this integrates a facility to view diagnostic and subsystem signals across the network using a centralized Matlab-based plotting and analysis tool for the SST-1 diagnostics data. The SST-1 data acquisition systems have operated reliably in the SST-1 experimental campaigns (Fig. 5).
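One shot cycle of a single DAS node, as described above, amounts to: wait for the timing-system trigger, capture the samples losslessly into a local buffer, write the shot file and push it to central storage. The C sketch below shows this flow with hypothetical daq_* stand-ins for the vendor driver calls; the channel count, rate and file name are illustrative assumptions, not the actual SST-1 conventions.

```c
/* Sketch of a single DAS node shot cycle (hypothetical daq_* stand-ins for
 * the driver API; shot numbering and file name are placeholders). */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define CHANNELS     16        /* channels handled by this DAS node (example) */
#define SAMPLE_RATE  100000    /* 100 kS/s per channel (example)              */
#define SHOT_SECONDS 2         /* acquisition window for this example         */

/* Hypothetical driver stand-ins: block until the hardware trigger arrives,
 * then read interleaved samples into the supplied buffer.                    */
static int daq_wait_for_trigger(void) { return 0; /* pretend trigger arrived */ }
static int daq_read_block(int16_t *buf, size_t total_samples)
{
    for (size_t i = 0; i < total_samples; ++i)
        buf[i] = 0;                              /* dummy data in this sketch */
    return 0;
}

int main(void)
{
    size_t per_chan = (size_t)SAMPLE_RATE * SHOT_SECONDS;
    size_t total    = per_chan * CHANNELS;
    int16_t *buf    = malloc(total * sizeof *buf);
    if (!buf) return 1;

    if (daq_wait_for_trigger() != 0) return 1;     /* armed from central control */
    if (daq_read_block(buf, total) != 0) return 1; /* lossless local capture     */

    /* Write the raw shot file locally; in practice the file is then pushed
     * to the NAS/SAN over Ethernet for central archival (name illustrative). */
    FILE *f = fopen("shot_0001_node01.raw", "wb");
    if (!f) { free(buf); return 1; }
    fwrite(buf, sizeof *buf, total, f);
    fclose(f);
    free(buf);
    return 0;
}
```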
Fig. 5. SST-1 experiment analysis plot.
6. SST-1 central storage system

The SAN-based data storage system [5] has been designed and configured with a three-tiered architecture and a GFS cluster file system with multipath support. Tier 1, of ∼3 TB (frequent access, low storage capacity), comprises FC-based hard disks for optimum throughput. Tier 2, of ∼6 TB (less frequent access, high storage capacity), comprises SATA-based hard disks. Tier 3 is planned for a later stage to store offline historical data. Tightly coupled storage servers in a cluster configuration work together to increase performance and reliability, distribute the workload and provide access to files from any server regardless of a file's physical location. Different RAID (Redundant Array of Independent Disks) groups are created with both FC and SATA hard disks to increase the reliability, security and performance of the storage system. The adopted SAN-based data storage for SST-1 is modular and robust and allows future expansion: storage modules can be added as and when required without changing the existing storage architecture. The data read/write time is adequate to meet the present throughput requirements of the individual subsystems. Sufficient hardware redundancy has been incorporated to assure uninterrupted 24 × 7 availability of experimental data within the intranet. The storage system can be expanded up to ∼100 TB of capacity with the existing controller pair.
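As a toy illustration of the tiering policy just described, data could be routed between tiers by an age-based rule such as the one below; the thresholds and tier labels are assumptions for illustration, not SST-1 parameters.

```c
/* Toy illustration of a three-tier placement policy
 * (age thresholds and labels are assumptions, not SST-1 settings). */
#include <stdio.h>

/* Return the storage tier for a shot, given how many days ago it was taken. */
static const char *storage_tier(int age_days)
{
    if (age_days <= 30)  return "Tier-1 (FC disks, frequent access)";
    if (age_days <= 365) return "Tier-2 (SATA disks, less frequent access)";
    return "Tier-3 (offline historical archive, planned)";
}

int main(void)
{
    int examples[] = { 1, 90, 1000 };
    for (int i = 0; i < 3; ++i)
        printf("shot taken %4d days ago -> %s\n",
               examples[i], storage_tier(examples[i]));
    return 0;
}
```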
7. Central control infrastructure

The central control infrastructure is composed of the networking system, the audio/video systems and the control room facilities assisting the SST-1 operation campaigns. An uninterruptible power supply system is installed so that critical control room computers, equipment and systems remain available 24 × 7. A false floor has been laid in the control room for structured cabling and electrical power management. The control room is kept electrically isolated from the rest of the SST-1 building by star-configured electrical grounding. An access control system is installed throughout the machine area, providing controlled access to the site.

7.1. Networking system

The SST-1 networking system physically connects all these subsystems. The various commands, events, data and signals travel over this network. The network has ∼120 nodes scattered across different buildings and is implemented using 1500 m of optical fiber cable with 625 terminations. It includes both 6-core and 24-core multimode fiber optic backbone cables with a 62.5/125 µm core/cladding ratio operating at an 850 nm wavelength. The same backbone is used for 100 Mbps Ethernet, the real-time high-speed reflective memory networks, the timing system networks and the various audio/video networks. Data from the different local controllers to the central storage is also transferred over this network.

7.2. Audio video systems

The SST-1 audio-visual system [8] is installed to enhance the surveillance capability for the SST-1 machine and its subsystems. The complete system is bifurcated into two networks, namely machine vision and audio conferencing. Both networks are deployed on distinct star-topology real-time optical fiber backbone networks. The audio conferencing (PA) system comprises an in-house customized main control panel, 32 fiber optic audio transceiver modules, two wireless access points, 18 active loudspeakers and 12 paging microphones installed at various subsystems in and around the SST-1 building.
The system is designed in such a way that any end terminal can initiate and send audio messages to all the other active nodes at the push of a button. The machine vision system is built around an 8-channel network DVR, eight remotely controllable cameras, 12 composite video signal modulators, a modulated-channel multiplexer, wideband VHF hybrid amplifiers, RF signal distributors and a total of 12 TV monitors. A standby machine vision system has also been established by deploying a four-channel camera controller, a modular video distribution amplifier, a PCI video grabber card and a quad video compressor. The system provides live streaming video to the central control room and to multiple other locations in the IPR campus for monitoring purposes. The camera channel viewed is user-selectable at the end terminals.
7.3. Control room facilities

The SST-1 central control room is equipped with a control panel with 8 authorized operator consoles, visual indicators and switches for control and monitoring, 10 user workstations for experimental data analysis, experimental data storage hardware with a present capacity of 10 TB, large displays (including 3 projectors and 2 LED TVs) providing continuous visual monitoring of essential parameters, a centralized audio system capable of connecting 16 spatially distributed subsystems within a periphery of 300 m, an intercom facility, surveillance cameras, tokamak machine monitoring cameras, an access control and monitoring system, etc. [7].

8. CCS upgrade plan

The data acquisition and control systems for SST-1 were designed around 15 years ago. Many of these systems are approaching technological obsolescence; the principal vendors have stopped providing support for these components, and many components have reached the end of their life. The backbone of the SST-1 CCS network will be upgraded to 10 Gbps bandwidth using single-mode fiber optic cable and appropriate network switches. Several technologies are being evaluated for the real-time network, including White Rabbit, 10 Gbps Ethernet and shared-memory based solutions. Several technologies are also being evaluated for the timing system [4].
9. Summary

The system's hardware and software architecture fulfills the objectives of the SST-1 central control system. The central control system has been indigenously designed, developed, tested and implemented at IPR. The timing system incorporates the necessary control logic to ensure the correct operation of the data acquisition and control systems, which have a distributed architecture.

Acknowledgment

The authors would like to thank all the SST-1 team members for their support in successfully carrying out the reported task.

References
[1] S. Pradhan, et al., First experiments in SST-1 (IAEA FEC 2014), Nucl. Fusion 55 (10) (2015) 104009.
[2] A. Chauhan, et al., Central control system for operation and control of SST-1, 9th Asia Plasma and Fusion Conference (APFA), Gyeongju, South Korea, 2013.
[3] M. Sharma, et al., Overview of data acquisition system for SST-1 diagnostics, 10th IAEA TM, India.
[4] A. Kumar, et al., Overview of time synchronization system of steady state superconducting Tokamak SST-1, 10th IAEA TM, India.
[5] M. Bhandarkar, et al., Archiving and retrieval of experimental data using SAN based centralized storage system for SST-1, 10th IAEA TM, India.
[6] H. Masand, et al., Design, architecture and operational experience of Machine Control System (MCS) for SST-1, 10th IAEA TM, India.
[7] J. Dhongde, et al., SST-1 central control infrastructure: an overview, 10th IAEA TM, India.
[8] H. Chudasama, et al., Overview of data acquisition and central control system of steady state superconducting tokamak (SST-1), 10th IAEA TM, India.
[9] P. Varmora, et al., Data acquisition system for helium mass flow measurement in SST-1 magnets, 10th IAEA TM, India.
[10] K. Patel, et al., Design and architecture of SST-1 basic plasma control system, 10th IAEA TM, India.