biac:pipeline (revision 2019/05/07 14:54 by cmp12; previous revision 2014/08/04 17:53 by cmp12_sa)
BIAC's Processing Pipeline is a software system designed to automatically transfer data off the scanner, run a series of automated processes on the dataset, and transfer the processed results to a directory where the user can access them. This process depends on proper assignment of the dataset to a valid Experiment. Once assigned, data is processed according to Experiment-based preferences.
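The transfer-process-deliver flow above can be sketched as a small Python function. This is an illustrative sketch only, not BIAC's actual implementation; the function names, the experiment name, and the preference structure are all hypothetical.

```python
# Hypothetical sketch of the pipeline stages described above.
# All names here (assign_experiment, run_pipeline, "Demo.01") are
# illustrative assumptions, not part of BIAC's real software.

def assign_experiment(dataset, valid_experiments):
    """Processing depends on assigning the dataset to a valid Experiment."""
    exp = dataset.get("experiment")
    if exp not in valid_experiments:
        raise ValueError(f"dataset not assigned to a valid Experiment: {exp!r}")
    return exp

def run_pipeline(dataset, valid_experiments, preferences):
    """Once assigned, process according to Experiment-based preferences."""
    exp = assign_experiment(dataset, valid_experiments)
    steps = preferences[exp]  # e.g. a list of automated processing steps
    # A real pipeline would execute each step and stage results for delivery;
    # here we only record which steps would run.
    return {"experiment": exp, "steps_run": list(steps)}

# Usage: one dataset, one valid Experiment, one preference list.
prefs = {"Demo.01": ["convert", "motion_correct"]}
result = run_pipeline({"experiment": "Demo.01"}, {"Demo.01"}, prefs)
```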
===== AIMS Overview =====
The Automated Imaging Management System (AIMS) is a multi-threaded Python daemon that runs on a desktop connected to the internal network.
===== Backup Daemon Flowchart & Backup Admin Workflow =====