Data Science and Engineering Intern
Employment Type:
Intern
Education Preference:
Enrollment in a graduate program with coursework in probabilistic/stochastic processes and ML methods, preferably with a biomedical concentration.
Description:
Asher Informatics is developing several software-enabled, professional service offerings to be sold to US healthcare organizations (HCOs) that need help developing, activating, and sustaining a clinical AI strategy. We define “clinical AI” to encompass software technologies – usually FDA-cleared Software as a Medical Device – used to assist licensed healthcare practitioners in patient management activities. The inputs to these clinical AI models are medical images (radiology, cardiology, pathology, nuclear medicine, endoscopy, and other video), signal waveforms from medical device systems (cardiac monitors, anesthesia benches, wearables, neurologic physiologic monitors), and often textual or numerical data in context for or about the patient. Our software tools will be used in three broad scenarios:
- Helping an HCO develop and justify a clinical AI strategy and a solution roadmap to realize it
- Helping an HCO select and evaluate marketed clinical AI solutions, including oversight of performance using retrospective and/or prospective data from the HCO’s practice
- Providing ongoing monitoring and surveillance of the HCO’s deployed clinical AI beyond what individual vendors are responsible for with their products, including demonstrating outcomes and whether the targets or assumptions from the AI strategy are being met
Responsibilities:
The data science and engineering intern will:
- Assist in statistical analyses and computations for research grant proposals.
- Investigate and prototype LLMs enriched with domain-specific documents and feedback.
- Explore and recommend methods for quantifying image noise and artifacts.
- Develop pipelines that apply state-of-the-art explainable AI methods to medical imaging studies.
- Assist with or lead data analysis for hospital clients; outputs include descriptive and tabular statistics and quality metrics.
- Develop and maintain data pipeline code.
- Execute batched experiments to validate customized solutions.