Inference

Usage instructions for the Client SDK for Inference

Quickstart

To read all inferences for a patient with MRN $PATIENT_MRN from the model with ID $MODEL_ID, authenticating as $USERNAME with private key file $PRIVATE_KEY_FILENAME, run:

Python

from client_sdk import InferenceAPIClient

async with InferenceAPIClient(
    username="$USERNAME",
    private_key_filename="$PRIVATE_KEY_FILENAME",
) as client:
    inferences = await client.get_inferences(
        model_id="$MODEL_ID",
        patient_mrn="$PATIENT_MRN",
    )

TypeScript

import { InferenceAPIClient } from 'bunkerhill-inference-api/client';

const client = new InferenceAPIClient('$USERNAME', '$PRIVATE_KEY_FILENAME');

const inferences = await client.getInferences(
  '$MODEL_ID',
  '$PATIENT_MRN',
);

InferenceAPIClient Reference

Python

Constructor

Method signature

def __init__(
    self,
    username: str,
    private_key_filename: Optional[str] = None,
    private_key_string: Optional[str] = None,
    base_url: str = 'https://api.bunkerhillhealth.com/',
) -> None:
    ...

Parameters

  • username (str): The username to authenticate the client.

  • private_key_filename (Optional[str]): Filename of the RSA private key.

  • private_key_string (Optional[str]): The RSA private key as a string.

  • base_url (str, optional): The base URL of the Inference API. Defaults to 'https://api.bunkerhillhealth.com/'.

Notes

  • At least one of private_key_filename or private_key_string must be provided.
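
For example, the key can be supplied in memory rather than from a file. A minimal sketch, assuming the PEM-encoded key has been exported in an environment variable (the name BUNKERHILL_PRIVATE_KEY is illustrative, not part of the SDK):

import os

from client_sdk import InferenceAPIClient

# The environment variable name is illustrative; any source of the
# PEM-encoded key string works.
async with InferenceAPIClient(
    username="$USERNAME",
    private_key_string=os.environ["BUNKERHILL_PRIVATE_KEY"],
) as client:
    ...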


get_inferences

Gets a list of Inference objects for a given patient and model from the Inference API. Must be called from an async context.

Hint: The get_inferences method is asynchronous. To use it from a synchronous application, wrap calls to the InferenceAPIClient in asyncio.run().
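
A minimal sketch of this pattern, reusing the quickstart placeholders:

import asyncio

from client_sdk import InferenceAPIClient


async def fetch_inferences():
    # Open the client, fetch the inferences, and close the client.
    async with InferenceAPIClient(
        username="$USERNAME",
        private_key_filename="$PRIVATE_KEY_FILENAME",
    ) as client:
        return await client.get_inferences(
            model_id="$MODEL_ID",
            patient_mrn="$PATIENT_MRN",
        )


# Drive the coroutine from synchronous code.
inferences = asyncio.run(fetch_inferences())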

Method signature

async def get_inferences(
    self,
    model_id: str,
    patient_mrn: str,
) -> List[Inference]:
    ...

Parameters

  • model_id (str): The ID of the model whose inferences to retrieve.

  • patient_mrn (str): The medical record number (MRN) of the patient.

Returns

List[Inference]: A list of JSON dicts, one per inference. See the Inference API documentation for more details on the format of these dicts.

Notes

  • You must be authorized to access the specified model_id; otherwise a 403 error is raised.

  • Only inferences from institutions you are authorized to access are returned.
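
Since each returned inference is a JSON dict, a quick way to inspect results from the quickstart is to pretty-print them. A minimal sketch (the exact fields are described in the Inference API documentation):

import json

# Each entry is a plain dict; dump it as formatted JSON.
for inference in inferences:
    print(json.dumps(inference, indent=2))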

TypeScript

Constructor

Method signature

constructor(
  username: string,
  privateKeyFilename?: string,
  privateKeyString?: string,
  baseUrl: string = 'https://api.bunkerhillhealth.com/',
) {}

Parameters

  • username (string): The username to authenticate the client.

  • privateKeyFilename (string, optional): Filename of the RSA private key.

  • privateKeyString (string, optional): The RSA private key as a string.

  • baseUrl (string, optional): The base URL of the Inference API. Defaults to 'https://api.bunkerhillhealth.com/'.

Notes

  • At least one of privateKeyFilename or privateKeyString must be provided.


getInferences

Fetches a list of Inference objects for a given patient and model from the Inference API. Must be called from an async context.

Method signature

async getInferences(
  modelId: string,
  patientMrn: string,
): Promise<Inference[]> {}

Parameters

  • modelId (string): The ID of the model whose inferences to retrieve.

  • patientMrn (string): The medical record number (MRN) of the patient.

Returns

Promise<Inference[]>: A list of JSON dicts, one per inference. See the Inference JSON Format for more details on the format of these dicts.

Notes

  • You must be authorized to access the specified modelId; otherwise a 403 error is raised.

  • Only inferences from institutions you are authorized to access are returned.

