BIAC's Processing Pipeline is a software system designed to automatically transfer data off of the scanner, run a series of automated processes on the dataset, and transfer the processed results to a directory where the user can access them. This process depends on proper assignment of the dataset to a valid Experiment. Once assigned, data is processed according to Experiment-based preferences, and the results are transferred to the appropriate Experiment directory, where users with data access privileges can retrieve them.
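As a rough illustration of the Experiment-based routing described above, the sketch below maps an Experiment name to a set of processing preferences and a results directory. The experiment names, preference fields, and paths are hypothetical examples, not the pipeline's actual configuration.

<code python>
# Illustrative sketch of Experiment-based routing (hypothetical names and paths).
from pathlib import Path
import shutil

# Hypothetical per-Experiment preferences: which processes to run and where
# the processed results should land.
EXPERIMENT_PREFS = {
    "Task.01": {"processes": ["bxhwrap", "nifti", "birn_qa"],
                "results_dir": Path("/mnt/experiments/Task.01/Data/Func")},
    "Anat.02": {"processes": ["bxhwrap", "nifti"],
                "results_dir": Path("/mnt/experiments/Anat.02/Data/Anat")},
}

def route_dataset(dataset: Path, experiment: str) -> Path:
    """Copy a processed dataset into the results directory for its assigned Experiment."""
    if experiment not in EXPERIMENT_PREFS:
        # Datasets that cannot be assigned to a valid Experiment are not routed.
        raise ValueError(f"{dataset.name} is not assigned to a valid Experiment")
    dest_dir = EXPERIMENT_PREFS[experiment]["results_dir"]
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(dataset, dest_dir / dataset.name))
</code>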
  
===== AIMS Overview =====
  
The Automated Imaging Management System (AIMS) is a multi-threaded Python daemon that runs on a desktop connected to the internal GE switch. AIMS connects to the scanner console with sshfs, monitors the hard drive for raw k-space data, and adds new files to a copy queue thread that moves them off the scanner for reconstruction and archiving. AIMS also launches a DICOM receiver to accept automated DICOM pushes from the console; a separate thread monitors the receive location for new data and schedules it for processing and archiving. During processing, reconstructed or DICOM images are wrapped with BXH headers, converted to nii.gz, and copied to individual experiment storage locations. If appropriate, a full BIRN QA is run for functional data, and the results are uploaded to our webserver as well as copied to the experiment storage location. GE physio files are also transferred if they were collected. Imaging metadata and QA numbers are added to our imaging database for use with various web services.
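A minimal sketch of the watch-and-copy pattern described above, using Python's standard threading and queue modules. The mount point, staging directory, and polling interval are assumptions for illustration; the real daemon handles reconstruction, error recovery, and scheduling beyond what is shown here.

<code python>
# Illustrative sketch of the AIMS watch/copy-queue pattern (paths are hypothetical).
import queue
import shutil
import threading
import time
from pathlib import Path

WATCH_DIR = Path("/mnt/scanner_console")   # sshfs mount of the scanner console
ARCHIVE_DIR = Path("/data/raw_archive")    # local staging area for reconstruction/archive

copy_queue: "queue.Queue[Path]" = queue.Queue()

def watch_for_kspace(poll_seconds: int = 10) -> None:
    """Poll the sshfs-mounted directory and queue any new raw k-space files."""
    seen = set()
    while True:
        for path in WATCH_DIR.iterdir():
            if path.is_file() and path not in seen:
                seen.add(path)
                copy_queue.put(path)
        time.sleep(poll_seconds)

def copy_worker() -> None:
    """Pull queued files off the scanner and hand them to downstream processing."""
    while True:
        src = copy_queue.get()
        shutil.copy2(src, ARCHIVE_DIR / src.name)
        # Downstream steps: reconstruction, BXH wrapping, nii.gz conversion, QA, archiving.
        copy_queue.task_done()

if __name__ == "__main__":
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    threading.Thread(target=watch_for_kspace, daemon=True).start()
    threading.Thread(target=copy_worker, daemon=True).start()
    threading.Event().wait()  # keep the daemon process alive
</code>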
  
{{ :biac:aims_flowchart.png?600 |}}
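The DICOM receive path in the overview above can be pictured as a small Storage SCP. The page does not say which library AIMS uses for this; the sketch below uses pynetdicom, and the AE title, port, and receive directory are assumptions.

<code python>
# Illustrative DICOM Storage SCP sketch (library choice, AE title, port, and
# receive directory are assumptions, not AIMS's actual configuration).
from pathlib import Path
from pynetdicom import AE, evt, AllStoragePresentationContexts

RECEIVE_DIR = Path("/data/dicom_receive")
RECEIVE_DIR.mkdir(parents=True, exist_ok=True)

def handle_store(event):
    """Write each received instance to disk for the processing thread to pick up."""
    ds = event.dataset
    ds.file_meta = event.file_meta
    ds.save_as(RECEIVE_DIR / f"{ds.SOPInstanceUID}.dcm", write_like_original=False)
    return 0x0000  # Success

ae = AE(ae_title="AIMS")
ae.supported_contexts = AllStoragePresentationContexts
# Listen for the automated DICOM pushes from the scanner console.
ae.start_server(("0.0.0.0", 11112), block=True,
                evt_handlers=[(evt.EVT_C_STORE, handle_store)])
</code>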
  
  
===== Backup Daemon Flowchart & Backup Admin Workflow =====