software
Overall software design is to control all components from a main server computer, song1m, located in the main computer room, running Linux. This communicates with the telescope/

Devices can also be controlled locally from the computer to which they are attached, as this is useful for lower level functionality,
===== Control software =====
A full set of Python commands that communicate with all of the component devices is implemented in the aposong.py module in the APOsong repository ([[https://
Given that all of the devices are on the internal APO network, the main control computer, song1m, is located on this network, and remote operation can be done using VNC with an openVPN or tunneling connection. On song1m, the software environment is set up for the song user.
An interactive client can be run from a Python environment. APOsong provides
To access the command set:
and you can use help(command) or command? to see the docstrings.
This will also provide instantiations of all of the Alpaca devices: Telescope as T, Dome as D, SafetyMonitor as S, Cameras as C[], Switches as SW[], Focusers as F[].
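
Under the hood these shorthand objects wrap ASCOM Alpaca devices, which are HTTP/REST endpoints served by the device computers. As a rough illustration only (this is not the aposong implementation, and the host:port below is a made-up placeholder), each device member maps to a URL of the standard Alpaca API v1 form:

```python
# Illustrative sketch: how an Alpaca client maps a device member to the
# REST endpoint it talks to. "pwi1m:11111" is a hypothetical host:port.
def alpaca_url(host, device_type, device_number, member):
    """Build the Alpaca API v1 URL for a device property or method."""
    return f"http://{host}/api/v1/{device_type.lower()}/{device_number}/{member.lower()}"

print(alpaca_url("pwi1m:11111", "Telescope", 0, "Declination"))
# http://pwi1m:11111/api/v1/telescope/0/declination
```

So a property access such as T.Declination ultimately becomes an HTTP GET against a URL of this shape, which is why the devices can be reached from any machine on the internal network.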

Documentation is available at http://
===== Robotic software =====
Alternatively,

For SONG operation, robotic operations will be conducted through Postgres database syncs from Aarhus; details TBD when this is implemented ...
===== song1m computer =====

This is the main computer interface. It is a virtual machine running on a box in the APO server room. Use the song account to log in. Once all remote software is started (see pwi1m, spec1m, and dome1m):
<code>
cd APOsong
ipython
> from aposong import *
</code>

This should show you a list of commands (which you can see again with the commands() function), and a list of connected devices.
===== pwi1m computer =====
  * several ASCOM drivers for telescope, port 1 focuser, and mirror cover control
  * a custom PlaneWave API interface
  * ASCOM remote
    * PWI-4 interface
    * PlaneWave Focuser (via PWI4)
    * PlaneWave Mirror Cover (via PWI4)
    * Atik guide camera
    * Atik eShel camera
    * QSI camera (for PlaneWave port 1)
    * QSI FilterWheel
  * A custom Alpaca server which supports Alpaca drivers for:
    * the Zaber focusing stage (Zaber XX) through an ASCOM Focuser device,
    * the calibration stage (Thorlabs LTS 150 stage), commanded through an ASCOM Focuser device,
    * the iodine stage (Thorlabs LTS 150 stage), commanded through an ASCOM Focuser device,
    * the iodine cell temperature controller (Thorlabs TC300 controller),
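
Several of these Alpaca drivers expose linear stages through the ASCOM Focuser interface, which speaks in integer steps while the stage hardware speaks in millimeters. A minimal mock of that wrapping (illustrative only; the steps-per-mm calibration below is invented, not the real stage value):

```python
class StageAsFocuser:
    """Toy sketch of exposing a linear stage via an ASCOM Focuser-style
    interface: Focuser positions are integer steps, stage moves are mm."""
    STEPS_PER_MM = 1000  # hypothetical calibration, not the real value

    def __init__(self):
        self._mm = 0.0  # current stage position in millimeters

    @property
    def Position(self):
        """ASCOM Focuser-style Position property, in steps."""
        return round(self._mm * self.STEPS_PER_MM)

    def Move(self, steps):
        """ASCOM Focuser-style Move: command an absolute step position."""
        self._mm = steps / self.STEPS_PER_MM

f = StageAsFocuser()
f.Move(75000)      # request 75000 steps = 75.0 mm with this calibration
print(f.Position)  # 75000
```

The real drivers additionally handle serial/USB communication with the stages and report motion status, but the step/mm translation is the core of presenting a stage as a Focuser.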

PlaneWave interface 4 is started from a desktop icon. It will open (after a minute or two) a graphical window (more details below). You need to ensure that the telescope is connected and all motors/

ASCOM remote is started from a desktop icon. It may need to be restarted if power is cycled to any of the devices that it interfaces to. It takes a minute or two to start and should show connections to all of the devices listed above.

The Alpyca server is started from an Anaconda Python shell (start using Windows input). Once the terminal window comes up:
<code>
cd APOalpyca/
python fpu_app.py
</code>

The system should then be ready for remote usage.

=== PlaneWave software ===

The PlaneWave software is not normally used for regular operation, as we just use the ASCOM and direct PWI interface. However, more detailed information and more command functionality is available through the PWI graphical interface, e.g., for pointing models and telescope troubleshooting. Hence we provide some information here.

On pwi1m, the telescope is controlled through the PWI4 software. This also has the capability to control an ASCOM camera, necessary when using PWI4 to construct a pointing model.
=== Alpaca devices ===
Several devices are controlled via ASCOM Alpaca, as implemented in the [[https://
  * ASCOM [[https://
    - built using Thorlabs 150 https://
    - built using Zaber stage https://
  * ASCOM [[https://
    - TC300 https://
    - Shelyak eShel calibration unit, built using K8056 routine

Documentation of the Alpaca drivers can be found at [[https://

===== spec1m computer =====

spec1m is the SONG spectrograph computer. It controls the spectrograph QHY detector and the Esatto focuser. It also controls two thermocouples that monitor the body temperature of the QHY and a relay that resets a watchdog timer if the body temperature is under a limit; otherwise the watchdog is not reset and the QHY camera power will be cut after 2-3 minutes.
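
The watchdog decision itself is simple; the following sketch restates the logic described above (this is not the actual spec1m code, and the temperature limit is a made-up placeholder, not the real operational threshold):

```python
# Sketch of the spec1m watchdog decision: the relay resets the watchdog
# only while the QHY body temperature stays under the limit; otherwise
# the watchdog expires and the camera power is cut a few minutes later.
TEMP_LIMIT_C = 30.0  # hypothetical limit, not the real value

def should_reset_watchdog(body_temps_c, limit=TEMP_LIMIT_C):
    """Reset only if every monitored thermocouple reads under the limit."""
    return all(t < limit for t in body_temps_c)

print(should_reset_watchdog([22.1, 23.4]))  # True  -> keep power on
print(should_reset_watchdog([22.1, 31.0]))  # False -> let watchdog trip
```

The point of the design is fail-safe behavior: if the monitoring loop hangs or a thermocouple reads hot, the reset simply stops happening and the hardware timer cuts power on its own.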

Run ASCOM remote for the QHY interface and the Esatto focuser interface.

Run the Alpyca server for the thermocouple interface from an Anaconda terminal shell:
<code>
cd APOalpyca/
python spectro_app.py
</code>

Since the QHY power is controlled by the thermocouple interface, the ASCOM remote interface won't connect until this is run: it will prompt you to connect the ASCOM device within 10 seconds.

===== dome1m computer =====
Dome control is implemented through software running on a Raspberry Pi, dome1m.

Low-level functions are implemented in the APOAshDome.py module. An ASCOM Alpaca interface was implemented using the templates from the [[https://

Software is archived at [[https://

APOAshDome controls dome shutter and rotation through relays, reads dome position through an encoder, and also reads the dome home switch. For the encoder, the pulses are read using the pigpiod daemon, so that must be installed and started every time the computer is rebooted:
This has been implemented in /
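
From the encoder pulses that pigpiod delivers, the dome azimuth follows by simple arithmetic. A sketch of that conversion (illustrative only, not the APOAshDome implementation; the counts-per-revolution and home azimuth values are placeholders, not the real dome calibration):

```python
# Sketch of turning encoder pulse counts into a dome azimuth.
COUNTS_PER_REV = 3600  # hypothetical encoder counts per dome revolution

def azimuth_from_counts(counts, home_az=0.0):
    """Dome azimuth in degrees, counting from the home-switch position.

    Counts may be negative (rotation in the other direction); the modulo
    wraps the result into [0, 360).
    """
    return (home_az + counts * 360.0 / COUNTS_PER_REV) % 360.0

print(azimuth_from_counts(900))                # 90.0
print(azimuth_from_counts(-900, home_az=45.0)) # 315.0
```

This is also why the home switch matters: the encoder only measures relative rotation, so the absolute azimuth is known only after the dome has found home.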
The dome software can be run locally from a python interface: from APOalpyca/device, start python and import APOAshDome. A Dome object can be instantiated through which properties and methods can be accessed:
<code>
import APOAshDome
D = APOAshDome.APOAshDome()
...
</code>
For documentation of the methods, see [[https://apoalpyca.readthedocs.io/
dome1m also runs an ASCOM Alpaca SafetyMonitor,

The Alpyca servers are started with:
<code>
cd APOalpyca/device
python
</code>
===== Linux installation =====

<code>
dnf install tcsh emacs git gnome-tweaks tigervnc-server gcc g++ strongswan
</code>

Change SELinux to Permissive and reboot.
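
On RHEL-family systems this is normally done by editing /etc/selinux/config; a reference fragment (standard file layout, shown as an example):

```
# /etc/selinux/config -- set enforcement mode, then reboot to apply
SELINUX=permissive
SELINUXTYPE=targeted
```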

<code>
useradd --uid 1013 -d /home/song song
passwd song
add song user into /
systemctl enable --now vncserver@:
firewall-cmd --permanent --add-service=vnc-server
firewall-cmd --reload
</code>

Set up ipsec.secrets,
<code>
systemctl enable --now strongswan
</code>
anaconda
install

add ssh-key
<code>
git clone APOsong.git
cd APOsong
pip install -e .
</code>
alpyca, tkthread, importlib_resources,

postgres 13
<code>
dnf install postgresql postgresql-server
</code>

pgadmin4
<code>
dnf install https://
dnf install -y https://
dnf install pgadmin4
</code>

Need to set up correct password authentication in pg_hba.conf and postgresql.conf.
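
As an illustration only (example entries to adapt, not the settings actually used on song1m):

```
# pg_hba.conf -- require password (SCRAM) auth for local TCP connections
# TYPE  DATABASE  USER  ADDRESS       METHOD
host    all       all   127.0.0.1/32  scram-sha-256

# postgresql.conf
password_encryption = scram-sha-256
listen_addresses = 'localhost'
```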
influxdb
systemctl start influxdb
</code>

influxdb-client
</code>

For a fresh install from backup:
<code>
rm -rf /
systemctl restart influxdb
influx setup --token ADMIN_TOKEN
influx restore /
</code>

grafana
<code>
wget -q -O gpg.key https://
sudo rpm --import gpg.key
wget -q -O gpg.key https://
rpm --import gpg.key
vi /
dnf install
systemctl daemon-reload
systemctl start grafana-server
</code>

Copy existing grafana.db
software.1736128545.txt.gz · Last modified: 2025/01/06 01:55 by holtz