The Research PACS on AWS solution helps researchers access medical images stored in the clinical PACS in a secure and seamless manner

Overview

Research PACS on AWS

Challenge to solve

The rise of new technologies in medical imaging, such as artificial intelligence, is a great opportunity to accelerate research and facilitate clinical improvements. This requires access to innovative services and computational capabilities, which the cloud can provide, as well as imaging and clinical data to support research studies.

Historically, it has been difficult for researchers to access medical images collected during the course of clinical care. Access is sometimes granted through a one-time, manual process in which a clinical PACS operator exports images and shares them with a research team on a USB stick or a shared file server. Personal health information may persist in the exported files, posing a privacy or compliance risk. Different research teams may also need access to different datasets, which requires governing and enforcing access rights to medical images.

Solution presentation

This solution aims to address this challenge. It enables researchers to access medical images stored in the clinical PACS in a secure and seamless manner, after potentially identifying information is removed, and allows medical images to be exported to Amazon S3 in order to leverage cloud capabilities for processing.

Key features

  • End-to-end solution that can be fully deployed using AWS CloudFormation without additional coding

  • Self-service portal accessible by researchers to browse, query and export DICOM files to Amazon S3

  • Integration with the clinical PACS for image ingestion using any of the protocols supported by Orthanc (DICOM C-STORE, DICOMweb, Orthanc API, etc.)

  • De-identification of original DICOM files based on transformation rules that you define in a YAML configuration file (a sketch of such a configuration follows this list).

    • Possible transformations include shifting dates and times by a random number of days or seconds, replacing the values of UI data elements with randomly generated UUIDs, replacing data element values with random strings, adding or deleting data elements, transcoding DICOM files, and removing burned-in pixel annotations using manual coordinates or Amazon Rekognition
    • Storage of mappings between original values and replaced values, in order to maintain consistency of data element values across multiple DICOM instances (e.g. StudyDate should be replaced by the same value in two DICOM instances that belong to the same series)
  • Storage of de-identified DICOM files in Amazon S3, a performant, scalable and cost-effective object storage service

  • Multiple security features

    • Authentication using Amazon Cognito
    • Granular permissions to give research teams access to only a subset of DICOM files
    • Traceability of user activity in the self-service portal
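
To illustrate the transformation rules mentioned above, here is a hedged sketch of what such a YAML configuration could look like. The key names and action values below are assumptions chosen for illustration, not the solution's actual schema; refer to the solution's documentation for the real configuration format.

```yaml
# Hypothetical sketch only: key names and action values are assumed
# for illustration and do not reflect the solution's actual schema.
transformations:
  - data_element: StudyDate
    action: shift_date          # shift by a random number of days
  - data_element: StudyInstanceUID
    action: random_uid          # replace with a randomly generated UID
  - data_element: PatientName
    action: random_string       # replace with a random string
  - data_element: PatientAddress
    action: remove              # delete the data element
  - data_element: PatientIdentityRemoved
    action: add                 # add a data element
    value: "YES"
```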

Screenshots

Click Screenshots to browse a selection of screenshots that illustrate the main features of the Research PACS on AWS solution.

Simplified architecture


  1. Your clinical PACS or your modalities are configured to automatically forward original DICOM files to the first Orthanc server, using one of the protocols supported by Orthanc (DICOM C-STORE, DICOMweb, Orthanc APIs). A minimal C-STORE sketch follows this list.

  2. The solution automatically detects and modifies new DICOM files to remove potentially identifying information about a patient, based on transformation rules that you define, and sends the de-identified DICOM file to the second Orthanc server.

  3. Researchers use a self-service portal to browse, search and export de-identified DICOM files to Amazon S3. Researchers must authenticate to access the portal, and user activity is logged for auditability purposes. The solution also allows you to restrict access to a subset of DICOM files only, based on permission profiles that you attach to users or groups.
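
As an illustration of step 1, the sketch below sends a DICOM file to the first Orthanc server over DICOM C-STORE using the pynetdicom library. The hostname, port, and AE titles are placeholder assumptions; in practice your PACS or modalities handle this forwarding, so a script like this is not part of the solution itself.

```python
# Hedged sketch: forward a DICOM file over C-STORE with pynetdicom.
# The endpoint and AE titles below are placeholder assumptions.
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

ds = dcmread("image.dcm")  # the DICOM file to forward

ae = AE(ae_title="RESEARCH_SCU")
ae.add_requested_context(CTImageStorage)  # presentation context for CT images

# Associate with the first Orthanc server (placeholder endpoint)
assoc = ae.associate("orthanc.example.com", 4242, ae_title="ORTHANC")
if assoc.is_established:
    status = assoc.send_c_store(ds)
    print(f"C-STORE completed with status 0x{status.Status:04X}")
    assoc.release()
```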

The solution consists primarily of Python scripts that run in Docker containers, notably leveraging Flask for the self-service portal and pydicom for manipulating DICOM files.
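
For a flavor of the pydicom-based manipulation, the following hedged sketch shifts StudyDate by a random number of days while reusing the same shift for a given patient, in the spirit of the mapping-based consistency described under Key features. The function name and the in-memory mapping are illustrative assumptions; the solution itself persists mappings between original and replaced values rather than keeping them in memory.

```python
# Illustrative sketch, not the solution's actual de-identifier:
# shift StudyDate by a random number of days, reusing the same
# shift per patient so related instances remain consistent.
import random
from datetime import datetime, timedelta
from pydicom import dcmread

shift_by_patient = {}  # PatientID -> shift in days (in-memory only)

def shift_study_date(in_path, out_path):
    ds = dcmread(in_path)
    days = shift_by_patient.setdefault(ds.PatientID, random.randint(-30, 30))
    original = datetime.strptime(ds.StudyDate, "%Y%m%d")
    ds.StudyDate = (original + timedelta(days=days)).strftime("%Y%m%d")
    ds.save_as(out_path)
```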

It also uses Orthanc, a popular open-source PACS server, and its Cloud Object Storage plugin for Amazon S3 to store and process DICOM files. For example, exporting a frame to a PNG file is done by calling an Orthanc API. Moreover, the solution can provide researchers with access to the underlying Orthanc server (the second Orthanc server) to take advantage of additional capabilities that may be offered by Orthanc Explorer or Orthanc plugins.
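
For instance, Orthanc exposes a REST route that returns a frame rendered as an image, so a call along the following lines retrieves a PNG. The server URL, credentials, and instance identifier below are placeholder assumptions.

```python
# Hedged sketch: fetch a frame preview as PNG from Orthanc's REST API.
# The URL, credentials, and instance ID are placeholder assumptions.
import requests

ORTHANC_URL = "http://orthanc.example.com:8042"
instance_id = "0a9b3153-2d9a-4dd2-acf7-9d2ab9191cb2"  # placeholder

resp = requests.get(
    f"{ORTHANC_URL}/instances/{instance_id}/frames/0/preview",
    auth=("researcher", "password"),  # if basic auth is enabled
    headers={"Accept": "image/png"},
)
resp.raise_for_status()
with open("frame.png", "wb") as f:
    f.write(resp.content)
```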

Deploy the solution

Deploy all components on AWS

This is the primary and easiest deployment pattern. Follow these instructions to provision the required resources using AWS CloudFormation and to configure the solution.
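
For orientation, a CloudFormation deployment from the AWS CLI generally looks like the sketch below. The template file name, stack name, and capabilities shown here are placeholders; follow the linked instructions for the actual template and parameters.

```bash
# Illustrative only: file name, stack name, and capabilities are
# placeholders; see the linked instructions for actual values.
aws cloudformation deploy \
  --template-file research-pacs-template.yaml \
  --stack-name research-pacs \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM
```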

Other deployment patterns

The first Orthanc server and the de-identifier can be deployed either on AWS, or on premises if you choose to send already de-identified data to the cloud. Moreover, if you already have an on-premises system to de-identify medical images, you can choose not to deploy the first Orthanc server and the de-identifier, and forward the medical images directly to the second Orthanc server.

Follow these instructions to learn more about these other deployment patterns and how to deploy them.

Further reading

Releases

License

This solution is licensed under the MIT-0 License. See the LICENSE file.

This solution depends on and may incorporate or retrieve a number of third-party software packages (such as open source packages) at install-time or build-time or run-time ("External Dependencies"). The External Dependencies are subject to license terms that you must accept in order to use this solution. If you do not accept all of the applicable license terms, you should not use this solution. We recommend that you consult your company’s open source approval policy before proceeding.

Provided below is a list of External Dependencies and the applicable license identification as indicated by the documentation associated with the External Dependencies as of Amazon's most recent review.

THIS INFORMATION IS PROVIDED FOR CONVENIENCE ONLY. AMAZON DOES NOT PROMISE THAT THE LIST OR THE APPLICABLE TERMS AND CONDITIONS ARE COMPLETE, ACCURATE, OR UP-TO-DATE, AND AMAZON WILL HAVE NO LIABILITY FOR ANY INACCURACIES. YOU SHOULD CONSULT THE DOWNLOAD SITES FOR THE EXTERNAL DEPENDENCIES FOR THE MOST COMPLETE AND UP-TO-DATE LICENSING INFORMATION.

YOUR USE OF THE EXTERNAL DEPENDENCIES IS AT YOUR SOLE RISK. IN NO EVENT WILL AMAZON BE LIABLE FOR ANY DAMAGES, INCLUDING WITHOUT LIMITATION ANY DIRECT, INDIRECT, CONSEQUENTIAL, SPECIAL, INCIDENTAL, OR PUNITIVE DAMAGES (INCLUDING FOR ANY LOSS OF GOODWILL, BUSINESS INTERRUPTION, LOST PROFITS OR DATA, OR COMPUTER FAILURE OR MALFUNCTION) ARISING FROM OR RELATING TO THE EXTERNAL DEPENDENCIES, HOWEVER CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, EVEN IF AMAZON HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. THESE LIMITATIONS AND DISCLAIMERS APPLY EXCEPT TO THE EXTENT PROHIBITED BY APPLICABLE LAW.
