Latest from IBM Developer: Create a proactive AWS healthcare management system

Summary

The healthcare domain generates a large amount of real-time data, which must be monitored to produce predictions and alerts for healthcare professionals. Monitoring such data manually is difficult. In this code pattern, we add AI-based predictions and automate the monitoring of healthcare data. To demonstrate IBM Cloud Pak for Data technology on AWS Cloud, we take the use case of predicting cardiac events based on real-time monitoring of patients’ health data.

Description

In this developer code pattern, you will learn to build a machine learning model with no code on IBM Cloud Pak for Data, create a streaming flow on Amazon Web Services (AWS) Cloud, and invoke the model to get predictions in real time.

The following IBM services are used in this code pattern:

IBM Cloud Pak for Data
Watson Studio
IBM SPSS Modeler (guided machine learning)
Watson Machine Learning

AWS services used in the code pattern:

AWS IAM roles
Amazon Kinesis
AWS Lambda functions
Amazon CloudWatch
Amazon S3

Once you complete the code pattern, you will know how to:

Create an AWS S3 bucket
Create an event notification on the S3 bucket that triggers a Lambda function when data is added to the bucket
Create IAM roles in AWS
Create a producer Lambda function to encrypt the data from the S3 bucket and send it to Amazon Kinesis
Create a machine learning model using IBM SPSS Modeler on IBM Cloud Pak for Data
Deploy the machine learning model on IBM Cloud Pak for Data and get the APIs to invoke the model
Create a Lambda consumer function to decrypt the streaming data from Amazon Kinesis and send it to the model to get predictions
View the real-time predictions from IBM Cloud Pak for Data in Amazon CloudWatch
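As a rough sketch of the producer side, the function below reads an uploaded object from S3 and streams its rows to Amazon Kinesis. This is not the pattern's actual code: the stream name, the CSV row format, and the use of base64 encoding as a stand-in for the pattern's encryption step are all assumptions (a real deployment would encrypt with something like AWS KMS).

```python
import base64

# Hypothetical Kinesis stream name -- substitute your own.
STREAM_NAME = "healthcare-stream"

def encode_record(row: str) -> bytes:
    """Encode one CSV row of patient data for the Kinesis stream.
    Base64 is a placeholder for the pattern's encryption step."""
    return base64.b64encode(row.encode("utf-8"))

def lambda_handler(event, context):
    """Triggered by the S3 event notification: read the uploaded object,
    encode each row, and put it on the Kinesis stream."""
    import boto3  # provided by the AWS Lambda runtime

    s3 = boto3.client("s3")
    kinesis = boto3.client("kinesis")

    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        for row in body.splitlines():
            kinesis.put_record(
                StreamName=STREAM_NAME,
                Data=encode_record(row),
                PartitionKey=key,  # keep rows from one upload together
            )
    return {"statusCode": 200}
```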


Anyone using AWS can seamlessly plug the Watson Machine Learning model from IBM Cloud Pak for Data into their flow.

Flow

Healthcare data is uploaded to an S3 bucket on AWS.
A producer Lambda function is triggered to encrypt the data and stream it to Amazon Kinesis.
A machine learning model is trained in Watson Studio on IBM Cloud Pak for Data using IBM SPSS Modeler, and the model is deployed in Watson Studio.
A consumer Lambda function reads the data from the Amazon Kinesis stream.
The consumer function invokes the model in Watson Studio with the data received from the Amazon Kinesis stream.
The data streamed from Amazon Kinesis, along with the predictions received from Watson Studio, is then visualized in Amazon CloudWatch.
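The consumer side of the flow above might look roughly like this: decode each Kinesis record, build a scoring payload, and POST it to the model's deployment endpoint. The endpoint URL, token, and field names are placeholders you would replace with values from your own deployment, and the `input_data`/`fields`/`values` payload shape is an assumption based on the Watson Machine Learning scoring API.

```python
import base64
import json
import urllib.request

# Placeholders -- the real values come from your Cloud Pak for Data deployment.
SCORING_URL = "https://<cpd-host>/ml/v4/deployments/<deployment-id>/predictions"
WML_TOKEN = "<watson-machine-learning-token>"
FIELDS = ["AGE", "BP", "CHOLESTEROL"]  # hypothetical model input columns

def build_scoring_payload(fields, row):
    """Wrap one row of values in the scoring-request shape the
    Watson Machine Learning endpoint expects."""
    return {"input_data": [{"fields": list(fields), "values": [list(row)]}]}

def lambda_handler(event, context):
    """Triggered by the Kinesis stream: decode each record, score it
    against the deployed model, and log the prediction."""
    for record in event["Records"]:
        raw = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        payload = build_scoring_payload(FIELDS, raw.split(","))
        req = urllib.request.Request(
            SCORING_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {WML_TOKEN}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            prediction = json.load(resp)
        # print() output lands in the function's CloudWatch log group,
        # which is where the predictions are viewed.
        print("input:", raw, "prediction:", prediction)
```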

Instructions

Ready to give it a try? Get detailed instructions in the README file. Those instructions explain how to:

Create an S3 bucket
Create an Amazon Kinesis stream
Create an IAM role in AWS
Create the producer Lambda function
Create an event notification for the S3 bucket
Build and deploy a Watson Machine Learning model, and copy the Watson Machine Learning token
Create the consumer Lambda function
Upload data to the S3 bucket
View logs in CloudWatch
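If you drive the AWS setup steps with boto3 rather than the console, the resource creation might be sketched as below. The bucket name, stream name, and region are hypothetical; the README has the authoritative walkthrough.

```python
BUCKET = "healthcare-input-data"  # hypothetical bucket name
STREAM = "healthcare-stream"      # hypothetical Kinesis stream name

def notification_config(lambda_arn: str) -> dict:
    """S3 event-notification configuration that invokes the producer
    Lambda whenever an object is created in the bucket."""
    return {
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": lambda_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    }

def create_resources(producer_lambda_arn: str, region: str = "us-east-1"):
    """Create the bucket and stream, then wire the bucket's event
    notification to the producer Lambda."""
    import boto3  # assumed available where this setup script runs

    s3 = boto3.client("s3", region_name=region)
    kinesis = boto3.client("kinesis", region_name=region)

    # Note: outside us-east-1, create_bucket also needs a
    # CreateBucketConfiguration with a LocationConstraint.
    s3.create_bucket(Bucket=BUCKET)
    kinesis.create_stream(StreamName=STREAM, ShardCount=1)
    s3.put_bucket_notification_configuration(
        Bucket=BUCKET,
        NotificationConfiguration=notification_config(producer_lambda_arn),
    )
```

The producer Lambda also needs permission to be invoked by S3 (an `add_permission` call or the equivalent IAM setup) before the notification takes effect.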
