How NIH is Using Big Data to Battle Cancer

In our previous article on the ARServices blog, we shared part one of a two-part podcast from Federal Technology Insider (FTI) that featured the CEO of ARServices, Jay McCargo. The podcast highlighted how Big Data solutions are being applied to federal programs at the National Institutes of Health (NIH) and the National Cancer Institute, and how Big Data and data analytics are helping to save and change lives.

In the second half of his conversation with FTI, Mr. McCargo took an even deeper dive into the program that the ARServices team supported at NIH. He discussed the sheer volume of data the team manages, which – although massive – is crucial to making an impact for the cause.

The team at ARServices collects and archives the many documents that support clinical trials for cancer treatments. While these documents are simply part of the process and not particularly remarkable on their own, a comprehensive archiving system enables the National Cancer Institute to analyze data more quickly and produce better results for their patients.

“Essentially, we are looking at thousands of records and instances. We analyze them, we aggregate them, and we present the aggregate of that finding for analysis to NIH scientists,” said Mr. McCargo. This aggregated data enables the NIH to better evaluate the effectiveness of drugs that are being used in clinical trials in an effort to treat cancer.

Ultimately, this data-driven effort streamlines clinical trial development and helps shorten approval timelines.

To read the original article and listen to the podcast on Federal Technology Insider, click HERE.