IMAS commands used in this process
idsdump - prints the full structure of an IDS to the terminal
idsdumppath - prints a specific node of an IDS to the terminal, for example the node ids_properties/version_put:
> idsdumppath $USER f4f 3 1106 2025 summary ids_properties/version_put
Type: <class 'imas.summary.ids_properties_version_put__structure'>
----------------------------------------------
----------------------------------------------
class version_put
    Attribute data_dictionary: 3.39.0
    Attribute access_layer: 5.0.0
    Attribute access_layer_language: python
idscp - copies IDSs from one data entry into another
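For reference, the general invocation pattern of idsdumppath, as I read it from the calls used throughout this page, is:

> idsdumppath <user> <database> <DD-major-version> <shot> <run> <ids-name> <path/to/node>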
Problematic situation
I had this file structure of test data:
/home/ITER/filipca/public/imasdb/f4f/
└── 3
    └── 1106
        ├── 2020
        │   ├── ids_001.characteristics
        │   ├── ids_001.datafile
        │   └── ids_001.tree
        └── 2021
            ├── ids_001.characteristics
            ├── ids_001.datafile
            └── ids_001.tree

4 directories, 6 files
but it turned out that it was built with AL 4.7.2 and DD 3.28.1:
> idsdumppath $USER f4f 3 1106 2020 summary ids_properties
Type: <class 'imas.summary.ids_properties__structure'>
----------------------------------------------
----------------------------------------------
class ids_properties
    Attribute comment: 0123456789abcdef
    Attribute homogeneous_time: 1
    Attribute source: 0123456789abcdef
    Attribute provider: 0123456789abcdef
    Attribute creation_date: 0123456789abcdef
    Attribute version_put
        class version_put
            Attribute data_dictionary: 3.28.1
            Attribute access_layer: 4.7.2
            Attribute access_layer_language: fortran
and when I tried to create a new IDS and copy some data into it, I got this error:
ERROR:root:b"al_plugin_begin_global_action: [ALLowlevelException = Compatibility between opened file version 1.0 and backend MDSPLUS_BACKEND version 1.1 can't be ensured (minor versions should match when writing). ABORT.\n]"
ERROR:root:b"al_plugin_begin_global_action: [ALLowlevelException = Compatibility between opened file version 1.0 and backend MDSPLUS_BACKEND version 1.1 can't be ensured (minor versions should match when writing). ABORT.\n]"
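Before attempting any write, it is worth checking how an existing entry was produced. A minimal sketch (using the same legacy-style DBEntry calls as in the script further down; paths and numbers are just my test data) that reads back ids_properties/version_put:

import imas
from imas.imasdef import MDSPLUS_BACKEND

# Open the old run and inspect which AL/DD versions wrote it
entry = imas.DBEntry(MDSPLUS_BACKEND, 'f4f', 1106, 2020,
                     '/home/ITER/filipca/public/imasdb')
entry.open()
summary = entry.get('summary')
print(summary.ids_properties.version_put.data_dictionary)   # -> 3.28.1 here
print(summary.ids_properties.version_put.access_layer)      # -> 4.7.2 here
entry.close()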
So, first I had to create a new summary IDS with AL5 and copy the data there.
There is a tool idscp that does that for me:
> idscp --help
usage: idscp [-h] [-u USER] [--database DATABASE] [--backend BACKEND] [--version VERSION]
             -si SHOT_INPUT -ri RUN_INPUT -so SHOT_OUTPUT -ro RUN_OUTPUT
             [-do DATABASE_OUTPUT] [-bo BACKEND_OUTPUT] [-f] [--setDatasetVersion] [-i]
             [-a | -o OUTPUTOCCURRENCE]
             [ids [ids ...]]

Copy IDSs from a data-entry into another one
so I created two more data entries (runs 2024 and 2025) using these commands:
> idscp --database f4f -si 1106 -ri 2021 -so 1106 -ro 2024 -do f4f -bo MDSPLUS --setDatasetVersion -i -a
> idscp --database f4f -si 1106 -ri 2021 -so 1106 -ro 2025 -do f4f -bo MDSPLUS --setDatasetVersion -i -a
and the final directory structure looks like this:
> tree f4f
f4f
└── 3
    └── 1106
        ├── 2020
        │   ├── ids_001.characteristics
        │   ├── ids_001.datafile
        │   └── ids_001.tree
        ├── 2021
        │   ├── ids_001.characteristics
        │   ├── ids_001.datafile
        │   └── ids_001.tree
        ├── 2024
        │   ├── ids_001.characteristics
        │   ├── ids_001.datafile
        │   └── ids_001.tree
        └── 2025
            ├── ids_001.characteristics
            ├── ids_001.datafile
            └── ids_001.tree

6 directories, 12 files
so now runs 2024 & 2025 are AL5-compatible:
> idsdumppath $USER f4f 3 1106 2025 summary ids_properties
Type: <class 'imas.summary.ids_properties__structure'>
----------------------------------------------
----------------------------------------------
class ids_properties
    Attribute comment: 0123456789abcdef
    Attribute homogeneous_time: 1
    Attribute source: 0123456789abcdef
    Attribute provider: 0123456789abcdef
    Attribute creation_date: 0123456789abcdef
    Attribute version_put
        class version_put
            Attribute data_dictionary: 3.39.0
            Attribute access_layer: 5.0.0
            Attribute access_layer_language: python
Each of these DBEntries contains only the 'summary' IDS; since it already exists, we can simply open the entry and get() it.
But if we want to add other IDSs to the same DBEntry, we have to create them from scratch using the corresponding constructor:
https://sharepoint.iter.org/departments/POP/CM/IMDesign/Code%20Documentation/ACCESS-LAYER-doc/python/5.2/use_ids.html#create-an-empty-id
In this example we are creating a 'dataset_description' IDS.
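Distilled down, the pattern is just this (a minimal sketch using the same AL5 calls as the full script below, shown here for one run only):

import imas
from imas import imasdef

entry = imas.DBEntry("imas:mdsplus?path=/home/ITER/filipca/public/imasdb/f4f/3/1106/2024", "r")

summary = entry.get('summary')            # 'summary' already exists in the entry -> get() it

dd = imas.dataset_description()           # 'dataset_description' does not exist yet -> constructor
dd.ids_properties.homogeneous_time = imasdef.IDS_TIME_MODE_HOMOGENEOUS
dd.time = summary.time                    # reuse the time base from the existing summary IDS

entry.put(dd)                             # write the new IDS into the same DBEntry
entry.close()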
And this is my full script:
import argparse

import imas
from imas import imasdef
from imas.imasdef import MDSPLUS_BACKEND
import numpy as np


def common_data_processing(ids_summary_2024, ids_summary_2025,
                           ids_dataset_description_2024, ids_dataset_description_2025,
                           entry_2024, entry_2025):
    '''
    ╔════════════════════════════════════════════════════════════╗
    ║                       data extraction                       ║
    ╚════════════════════════════════════════════════════════════╝
    '''
    summary_time_2024 = ids_summary_2024.time
    summary_comment_2024 = ids_summary_2024.ids_properties.comment
    summary_al_version_2024 = ids_summary_2024.ids_properties.version_put.access_layer
    summary_ip_value_2024 = ids_summary_2024.global_quantities.ip.value

    print(f"\n ------ summary -----\n")
    print(f'=== IDS SUMMARY TIME: {ids_summary_2024.time}')
    print(f'=== IDS SUMMARY COMMENT: {summary_comment_2024}')
    print(f'=== IDS SUMMARY AL PUT VERSION: {summary_al_version_2024}')
    print(f'=== IDS SUMMARY global_quantities/ip array:\n {summary_ip_value_2024}')

    summary_time_2025 = ids_summary_2025.time
    summary_comment_2025 = ids_summary_2025.ids_properties.comment
    summary_al_version_2025 = ids_summary_2025.ids_properties.version_put.access_layer
    summary_ip_value_2025 = ids_summary_2025.global_quantities.ip.value

    print("\n")
    print(f'=== IDS SUMMARY TIME: {ids_summary_2025.time}')
    print(f'=== IDS SUMMARY COMMENT: {summary_comment_2025}')
    print(f'=== IDS SUMMARY AL PUT VERSION: {summary_al_version_2025}')
    print(f'=== IDS SUMMARY global_quantities/ip array:\n {summary_ip_value_2025}')

    '''
    ╔════════════════════════════════════════════════════════════╗
    ║                        data ingestion                       ║
    ╚════════════════════════════════════════════════════════════╝
    '''
    ids_dataset_description_2024.data_entry.user = 'g2afilip'
    ids_dataset_description_2024.data_entry.machine = 'iter'
    ids_dataset_description_2024.data_entry.pulse_type = 'simulation'
    ids_dataset_description_2024.data_entry.pulse = 1106
    ids_dataset_description_2024.data_entry.run = 2024
    ids_dataset_description_2024.ids_properties.homogeneous_time = imas.imasdef.IDS_TIME_MODE_HOMOGENEOUS
    ids_dataset_description_2024.time = ids_summary_2024.time

    ids_dataset_description_2025.data_entry.user = 'g2kniznik'
    ids_dataset_description_2025.data_entry.machine = 'west'
    ids_dataset_description_2025.data_entry.pulse_type = 'pulse'
    ids_dataset_description_2025.data_entry.pulse = 1106
    ids_dataset_description_2025.data_entry.run = 2025
    ids_dataset_description_2025.ids_properties.homogeneous_time = imas.imasdef.IDS_TIME_MODE_HOMOGENEOUS
    ids_dataset_description_2025.time = ids_summary_2025.time

    '''
    ╔════════════════════════════════════════════════════════════╗
    ║                       data extraction                       ║
    ╚════════════════════════════════════════════════════════════╝
    '''
    print(f"\n ------ dataset_description -----\n")
    print(f"=== IDS dataset_description USER: {ids_dataset_description_2024.data_entry.user}")
    print(f"=== IDS dataset_description MACHINE: {ids_dataset_description_2024.data_entry.machine}")
    print(f"=== IDS dataset_description PULSE_TYPE: {ids_dataset_description_2024.data_entry.pulse_type}")
    print(f"=== IDS dataset_description PULSE: {ids_dataset_description_2024.data_entry.pulse}")
    print(f"=== IDS dataset_description RUN: {ids_dataset_description_2024.data_entry.run}")
    print(f"=== IDS dataset_description HOMOGENEOUS_TIME: {ids_dataset_description_2024.ids_properties.homogeneous_time}")
    print(f"=== IDS dataset_description TIME: {ids_dataset_description_2024.time}")

    print("\n")
    print(f"=== IDS dataset_description USER: {ids_dataset_description_2025.data_entry.user}")
    print(f"=== IDS dataset_description MACHINE: {ids_dataset_description_2025.data_entry.machine}")
    print(f"=== IDS dataset_description PULSE_TYPE: {ids_dataset_description_2025.data_entry.pulse_type}")
    print(f"=== IDS dataset_description PULSE: {ids_dataset_description_2025.data_entry.pulse}")
    print(f"=== IDS dataset_description RUN: {ids_dataset_description_2025.data_entry.run}")
    print(f"=== IDS dataset_description HOMOGENEOUS_TIME: {ids_dataset_description_2025.ids_properties.homogeneous_time}")
    print(f"=== IDS dataset_description TIME: {ids_dataset_description_2025.time}")

    '''
    ╔════════════════════════════════════════════════════════════╗
    ║                      saving data entry                      ║
    ╚════════════════════════════════════════════════════════════╝
    '''
    entry_2024.put(ids_dataset_description_2024)
    entry_2025.put(ids_dataset_description_2025)
    print(f"\nIDS 'dataset_description' for 2024 and 2025 are saved successfully!")

    '''
    ╔════════════════════════════════════════════════════════════╗
    ║                      closing data entry                     ║
    ╚════════════════════════════════════════════════════════════╝
    '''
    entry_2024.close()
    entry_2025.close()
    print("\nDBEntries for shots 2024 and 2025 are closed successfully!")


def al5_process():
    np.set_printoptions(threshold=10)

    uri_2024 = "imas:mdsplus?path=/home/ITER/filipca/public/imasdb/f4f/3/1106/2024"
    uri_2025 = "imas:mdsplus?path=/home/ITER/filipca/public/imasdb/f4f/3/1106/2025"

    entry_2024 = imas.DBEntry(uri_2024, "r")
    ids_summary_2024 = entry_2024.get('summary')
    ids_dataset_description_2024 = imas.dataset_description()

    entry_2025 = imas.DBEntry(uri_2025, "r")
    ids_summary_2025 = entry_2025.get('summary')
    ids_dataset_description_2025 = imas.dataset_description()

    # Call common processing logic
    common_data_processing(ids_summary_2024, ids_summary_2025,
                           ids_dataset_description_2024, ids_dataset_description_2025,
                           entry_2024, entry_2025)

    print("\nAL5 Process Complete.\n")


def al4_process():
    np.set_printoptions(threshold=10)

    user = '/home/ITER/filipca/public/imasdb'
    db_name = 'f4f'
    shot = 1106
    run = 2024
    backend_id = MDSPLUS_BACKEND

    entry_2024 = imas.DBEntry(backend_id, db_name, shot, run, user)
    entry_2024.open()
    ids_summary_2024 = entry_2024.get('summary')
    ids_dataset_description_2024 = imas.dataset_description()

    entry_2025 = imas.DBEntry(backend_id, db_name, shot, run + 1, user)
    entry_2025.open()
    ids_summary_2025 = entry_2025.get('summary')
    ids_dataset_description_2025 = imas.dataset_description()

    # Call common processing logic
    common_data_processing(ids_summary_2024, ids_summary_2025,
                           ids_dataset_description_2024, ids_dataset_description_2025,
                           entry_2024, entry_2025)

    print("\nAL4 Process Complete.\n")


def main(al_version):
    if al_version == 5:
        al5_process()
    elif al_version == 4:
        al4_process()
    else:
        print("Unsupported AL version specified.")


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description="Process data with AL version.")
    parser.add_argument("--al", type=int, choices=[4, 5],
                        help="AL version to process data with (4 or 5).")
    args = parser.parse_args()
    main(args.al)
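The script name is arbitrary (I saved it as put_dataset_description.py); it is run with either AL flavour:

> python put_dataset_description.py --al 5
> python put_dataset_description.py --al 4

The --al 5 path uses the URI-based DBEntry constructor, while the --al 4 path uses the legacy (backend, database, shot, run, user) one; both end up in the same common_data_processing().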
So after running this script, each DBEntry contains two IDSs ('summary' and 'dataset_description') in the same entry:
- 2024
> idsdumppath $USER f4f 3 1106 2024 summary ids_properties/version_put
Type: <class 'imas.summary.ids_properties_version_put__structure'>
----------------------------------------------
----------------------------------------------
class version_put
    Attribute data_dictionary: 3.39.0
    Attribute access_layer: 5.0.0
    Attribute access_layer_language: python

============================================================================================

> idsdumppath $USER f4f 3 1106 2024 dataset_description data_entry
Type: <class 'imas.dataset_description.data_entry__structure'>
----------------------------------------------
----------------------------------------------
class data_entry
    Attribute user: g2afilip
    Attribute machine: iter
    Attribute pulse_type: simulation
    Attribute pulse: 1106
    Attribute run: 2024
- 2025
> idsdumppath $USER f4f 3 1106 2025 summary ids_properties/version_put
Type: <class 'imas.summary.ids_properties_version_put__structure'>
----------------------------------------------
----------------------------------------------
class version_put
    Attribute data_dictionary: 3.39.0
    Attribute access_layer: 5.0.0
    Attribute access_layer_language: python

========================================================================================

> idsdumppath $USER f4f 3 1106 2025 dataset_description data_entry
Type: <class 'imas.dataset_description.data_entry__structure'>
----------------------------------------------
----------------------------------------------
class data_entry
    Attribute user: g2kniznik
    Attribute machine: west
    Attribute pulse_type: pulse
    Attribute pulse: 1106
    Attribute run: 2025