UZH CCD Testing Setup (DAMIC-M / DAMIC)


Introduction

This wiki page provides a functional description of the CCD testing setup at the Physics Institute of the University of Zurich and its associated hardware and software. It covers the cryostat, the associated electronics, and the instrumentation infrastructure. The setup is based on the AlpineCube cryostat, an in-house design by the University of Zurich's Peter Robmann, and is currently located in building 36, floor H, room 78.

AlpineCube


Developments for DAMIC-M

Below is a list of tools that are or were developed at Zurich for the DAMIC-M experiment.

Setup

General information

Connection Table

Device | Power | Network | Switch port | Output 1 | Output 2 | Output 3 | Output 4
Server | Multiprise | Switch 1 / UZH Network | Eth1.1 | UZH Network (eth4) | ACM (eth5) | Damic Intranet (eth6) | LTA (eth7)
R&S HMP 2030 | Multiprise | Switch 1 | Eth5.1 | LTA (12V) | Leach Frontend (+5V) | Leach Frontend (-5V) | N/A
R&S HMP 2030 | Multiprise | Switch 1 | Eth8.1 | ACM (-15V) | ACM (+15V) | ACM (-30V) | N/A
Keithley 2470 SourceMeter | Multiprise | Switch 1 | Eth4.1 | Leach Frontend VSUB | N/A | N/A | N/A
VME8004X | PDU | N/A | N/A | N/C | ACM | N/C | N/A
Lakeshore 335 | Multiprise | DS-700 | N/A | PT100 Cryohead | PT100 Sample holder | N/A | N/A
Netgear GS 108 (1) | Multiprise | Server 2 | N/A | N/A | N/A | N/A | N/A
Netgear GS 108 (2) | Multiprise | Switch 1 | N/A | N/A | N/A | N/A | N/A
Netio Power PDU 4PS | Multiprise | Switch 1 | Eth3.1 | VME8004X | Orca Cryocooler | Solenoid Valve | Vacuum Pump
Single Gauge TPG 361 | Multiprise | Switch 1 | Eth2.1 | PKR 251 | N/A | N/A | N/A
LTA | HMP4040 | Server 3 | N/A | Server 3 | N/A | N/A | N/A
Leach System | Multiprise | N/A | N/A | ARC-66 left | ARC-66 right | N/A | N/A
ACM | VME | Server 4 | N/A | Server 4 | N/A | N/A | N/A
HiCube Vacuum Pump | PDU | Moxa Serial LAN | N/A | AlpineCube | N/A | N/A | N/A
Cryocooler | PDU | N/A | N/A | Cryostat | N/A | N/A | N/A
Bürkert W26A Solenoid Valve | PDU | N/A | N/A | N/A | N/A | N/A | N/A
DS-700 | Multiprise | Switch 1 | Eth7.1 | Lakeshore 335 | N/C | N/A | N/A
Moxa Serial LAN | Multiprise | Switch 1 | Eth6.1 | HiCube Vacuum Pump | N/A | N/A | N/A

Power Consumption

This table shows the maximal power consumption of our system. Some manufacturers only provide a VA rating; for the conversion to watts we assumed an efficiency (power) factor of 0.8.

Instrument | Power Consumption
R&S HMP2030 | 300 W
R&S HMP2030 | 300 W
Keithley 2470 SourceMeter | 220 VA
Lakeshore 335 | 210 VA
RS300-E7/PS4 | 350 W
Netgear GS 108 | 10 W
NETIO PowerPDU 4PS | 5 W
TPG 361 | 50 VA
Moxa Serial LAN | 1 W
DS-700 | 5 W
Bürkert 0211 Solenoid Valve | 4 W
Pfeiffer HiCube 30 | 170 W
ORCA Mixed Refrigerant Cooler | 750 VA
Leach Readout | 40 W
VME Crate (WV8004XVME00) | 450 W
Total | 3726 W
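The VA-to-W conversion described above can be reproduced in a few lines of Python (a minimal sketch; the instrument list is copied from the table, and the 0.8 factor is the assumption stated in the text):

```python
# Sketch of the power-budget arithmetic for the table above.
# Ratings quoted in VA are converted to watts with the assumed
# efficiency (power) factor of 0.8; W ratings are summed as-is.

POWER_FACTOR = 0.8  # assumed conversion factor for VA ratings

ratings = [
    ("R&S HMP2030", 300, "W"),
    ("R&S HMP2030", 300, "W"),
    ("Keithley 2470 SourceMeter", 220, "VA"),
    ("Lakeshore 335", 210, "VA"),
    ("RS300-E7/PS4", 350, "W"),
    ("Netgear GS 108", 10, "W"),
    ("NETIO PowerPDU 4PS", 5, "W"),
    ("TPG 361", 50, "VA"),
    ("Moxa Serial LAN", 1, "W"),
    ("DS-700", 5, "W"),
    ("Bürkert 0211 Solenoid Valve", 4, "W"),
    ("Pfeiffer HiCube 30", 170, "W"),
    ("ORCA Mixed Refrigerant Cooler", 750, "VA"),
    ("Leach Readout", 40, "W"),
    ("VME Crate (WV8004XVME00)", 450, "W"),
]

def to_watts(value, unit):
    """Convert a VA rating to W using the assumed power factor."""
    return value * POWER_FACTOR if unit == "VA" else value

total = sum(to_watts(value, unit) for _, value, unit in ratings)
print(f"Total: {total:.0f} W")
```

Note that the result is sensitive to the assumptions: treating VA ratings at face value, or adding a safety margin on top of the sum, will give a different grand total than a plain 0.8-factor conversion.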

Network Configuration

Device | IP | Netmask | Gateway | DNS | Hostname | Username | Password
Server Intranet | 192.168.200.100 | 255.255.255.0 | 192.168.200.1 | 130.60.164.1 | CCDTest | damic | 1rChEL
Server Internet | 10.65.117.42 | | | | | |
Power PDU 4PS | 192.168.200.11 | 255.255.255.0 | 192.168.200.1 | 130.60.164.1 | PowerPDU-CF | admin | 1rChEL@PDU
Single Gauge | 192.168.200.12 | 255.255.255.0 | 192.168.200.1 | 130.60.164.1 | | N/A |
Serial-LAN Adapter Vacuum | 192.168.200.13 | 255.255.255.0 | 192.168.200.1 | 130.60.164.1 | | admin | 1rChEL@Serial
HVPSU Keithley | 192.168.200.14 | 255.255.255.0 | 192.168.200.1 | | K-2470 | admin | 1rChEL@HVPSU
LVPSU R&S HMP2030 | 192.168.200.15 | 255.255.255.0 | 192.168.200.1 | | | admin | 1rChEL@LVPSU
LVPSU R&S HMP2030 | 192.168.200.16 | 255.255.255.0 | 192.168.200.1 | | | admin | 1rChEL@LVPSU
USB-LAN adapter Lakeshore | 192.168.200.17 | 255.255.255.0 | 192.168.200.1 | 130.60.164.1 | Lakeshore | admin | 1rChEL
LTA | 192.168.133.7 | | | | | admin | 1rChEL@USB24
ACM | 192.168.1.5 | | | | | |
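A quick way to sanity-check these assignments is to test each static address against the intranet subnet (192.168.200.0/24, implied by the 255.255.255.0 netmask and the 192.168.200.1 gateway) with Python's standard ipaddress module. This is a minimal sketch; the device list is copied from the table, with credentials omitted:

```python
# Check which of the static addresses above fall inside the
# 192.168.200.0/24 intranet subnet used by the setup.
import ipaddress

INTRANET = ipaddress.ip_network("192.168.200.0/24")

devices = {
    "Server Intranet": "192.168.200.100",
    "Power PDU 4PS": "192.168.200.11",
    "Single Gauge": "192.168.200.12",
    "Serial-LAN Adapter Vacuum": "192.168.200.13",
    "HVPSU Keithley": "192.168.200.14",
    "LVPSU R&S HMP2030 (1)": "192.168.200.15",
    "LVPSU R&S HMP2030 (2)": "192.168.200.16",
    "USB-LAN adapter Lakeshore": "192.168.200.17",
    "LTA": "192.168.133.7",
    "ACM": "192.168.1.5",
}

for name, ip in devices.items():
    on_intranet = ipaddress.ip_address(ip) in INTRANET
    print(f"{name:28s} {ip:16s} intranet={on_intranet}")
```

As expected from the connection table, the LTA and ACM sit on their own subnets behind dedicated server interfaces, so they fall outside 192.168.200.0/24.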

Server

The main computer is an Asus RS300-E7/PS4 (RS300-E7-PS4/WOCPU/WOMEM/WOHDD) 1U single-CPU server. The main processor is a four-core Intel Xeon E3-1220 @ 3.10 GHz (Sandy Bridge), while the graphics subsystem is handled by an Nvidia Quadro 600 card with 1 GB of memory and 96 CUDA cores on a PCI Express x16 Gen2 bus. Four 4 GB DDR3-1333 ECC UDIMMs are installed for a main memory (RAM) of 16 GB, with a maximum of 32 GB supported in a 4 x 8 GB dual-channel configuration.

Server Specifications
 | Server 1 | Server 2
Case Model | RS300-E7/PS4 (Product Page, Manual) | RS300-E7/PS4 (Product Page, Manual)
Mainboard | ASUS P8B-E/4L (Product Page, Manual) | ASUS P7F-E (Product Page, Manual)
Chipset | Intel C204 | Intel Q57
CPU | Intel Xeon E3-1220 @ 3.10 GHz | Intel Xeon X3450 @ 2.67 GHz
Socket Type | LGA1155 | LGA1156
RAM | 4 x 4 GB DDR3 1333 ECC UDIMM | 6 x 4 GB DDR3 1333 ECC DIMM
BIOS Version | Version 6702 (latest) | Version 0404
Graphics Card | Nvidia Quadro 600, 1 GB | N/A
Network | Quad Intel GbE LAN controllers with IPv6 | Intel 82578DM Gigabit LAN controller
Hard Drives | 4 x Hitachi Ultrastar 7K4000 0F14683 (4 TB) (Datasheet) | 4 x Seagate Constellation ES ST2000NM0011 (2 TB) (Datasheet)

For storage, an array of four 4 TB hot-swap HDDs, configured in RAID 10, is used with the on-board Intel RAID controller (Intel Rapid Storage Technology). In this setup, data is striped across two drives (RAID 0) for speed, and each stripe is mirrored to the other two drives (RAID 1) for redundancy. This configuration enhances write speed while maintaining a constant backup of the entire system. The four 4 TB hard drives thus provide an effective storage capacity of 8 TB.
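The capacity arithmetic can be made explicit with a trivial sketch (assuming the classic RAID 10 layout of striping over mirrored pairs, as described above):

```python
# RAID 10 capacity sketch for this server: four 4 TB drives,
# striped across mirrored pairs, so half the raw capacity is
# consumed by the mirror copies.
N_DRIVES = 4
DRIVE_TB = 4

# Usable capacity = raw capacity / 2 (one mirror copy per stripe).
usable_tb = N_DRIVES * DRIVE_TB // 2
print(f"Usable capacity: {usable_tb} TB")  # 8 TB, matching the text
```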

The system does not support UEFI, so access to the full storage capacity (>2 TB volumes) requires an operating system that supports GPT partitioning. The lack of UEFI limits Windows to MBR-only installations, so UNIX-type operating systems are recommended. Ubuntu 20.04.6 LTS was chosen as the operating system due to its compatibility with LabVIEW (Q3 2022), the LTA, and our image acquisition software. The default package manager is apt; rpm is also present. For further information about our server, please read the following pages.

Software

Multiple software packages, programs, and other files are installed on this server; a full list can be found at the following link: Software

Instruments

Our instruments can be loosely divided into five groups:

Readout Systems

All implementations of CCD readout systems share a common architecture comprising four functional elements. The picture below shows the general outline of a CCD.

  • The first subsystem provides the bias voltages required by the charge amplifiers located at the edge of the CCD matrix, the isolation voltages for the p-well, and any additional bias voltages required for the operation of the serial registers at the periphery of the matrix. The depletion/substrate voltage is frequently provided by the same subsystem, although requirements for increased stability and accuracy may call for a dedicated high-precision external source (e.g. the Keithley 2460/70 series).
  • The second subsystem generates the various clocks needed for shifting charges along the matrix and performing readout operations. In a 4-quadrant CCD (four amplifiers, one in each corner), three horizontal and three vertical clocks are needed per quadrant. The horizontal clocks move charge from column to column, while the vertical clocks move charge from row to row. Due to the symmetry of the shifting operations in the different quadrants, vertically aligned parts can share the same horizontal clocks and vice versa, halving the number of required lines (12 instead of 24). As a result, a total of 6 vertical and 6 horizontal clocks are needed. Another 10 clock signals are additionally required, 5 per vertical side (toggle gate, output gate, drain gate, register, switch), for a complete implementation of the matrix readout.
  • The third subsystem takes care of the amplification and digitization of the analog signals provided by the CCD. This part may also be referred to as the video board/circuit, a term originating from the astronomy-related usage of CCDs for sky imaging.
  • The fourth and final part of any CCD readout system implements the communication protocol with the host computer and takes care of any processing/compression of the recorded data. A combination of optical fibers, USB, and/or Ethernet-based connections may be used for this purpose.
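The clock-line bookkeeping in the bullet points above can be checked with a few lines of Python (a minimal sketch of the arithmetic; the variable names are ours):

```python
# Clock-line count for a 4-quadrant CCD, following the text:
# 3 horizontal + 3 vertical clocks per quadrant, shared between
# symmetric quadrants, plus 5 extra gate clocks per vertical side.
QUADRANTS = 4
H_PER_QUAD = 3  # horizontal clocks per quadrant
V_PER_QUAD = 3  # vertical clocks per quadrant

unshared = QUADRANTS * (H_PER_QUAD + V_PER_QUAD)  # 24 lines if nothing is shared
shared = unshared // 2                            # sharing halves this: 12 lines

# 5 additional clocks per vertical side (toggle gate, output gate,
# drain gate, register, switch).
extra = 2 * 5

total = shared + extra
print(f"{unshared} unshared -> {shared} shared shifting clocks, "
      f"plus {extra} gate clocks = {total} lines")
```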

Three different readout options for skipper CCDs exist. The first is the Leach system, developed by Astronomical Research Cameras. The second is the Low Threshold Acquisition (LTA) system, developed by Fermilab for the Oscura experiment. The third uses the ACM board, developed by the University of Chicago and LPNHE specifically for the DAMIC-M experiment. Information for each system is available on the following pages:

CCD Structures

The following pages display various information about the CCDs used in this setup:

wiki.txt · Last modified: 2024/10/07 15:42 by simon