AI in Pathology: How this startup aims to detect cancer at your local clinic #AIBOSS


We, at NextBigWhat, are attempting to drive forward the conversation around Artificial Intelligence in India beyond headlines that scream for attention but don’t offer much, buzzwords that make your head buzz after a while, and countless utopian tales that vie for your eyes only to leave you scratching your head, with a lingering itch to actually get to the meat of the matter.

In this spirit, the #AIBoss series intends to educate, inform and elevate our readers’ understanding of the subject as well as the space. If you’re part of the leadership team of an AI-based product, from an enterprise or a startup, and are willing to share your AI learnings, then reach out to us and become part of this initiative (you can apply using this form).


Adarsh Natarajan, CEO & Founder of Aindra, alongside the company’s computational pathology devices.

Adarsh Natarajan is the CEO & Founder of Aindra, a medtech company that offers AI-based computational pathology solutions. Their first product, CervAstra, is aimed at automating the detection of cervical cancer, an illness that causes more deaths in India than in any other nation. Aindra says that their core computational pathology platform, Astra, has been designed to be extended to various other critical illnesses including, but not limited to, other types of cancer.

We caught up with Adarsh for a comprehensive tête-à-tête about Aindra, the kind of work they’re doing, how AI can help solve healthcare woes in India, the challenges that exist in AI diagnostics, and much more.

How would you introduce Aindra?

So, to put it a little technically, we are an AI-based computational pathology company, and we leverage that to provide point-of-care diagnosis for critical illnesses. In layman’s terms, we provide the ability to detect certain kinds of cancers right at an ordinary physician’s clinic.

So, from scanning the history of Aindra, it seems that you pivoted from education-centric AI solutions to AI Diagnostics. So how did you arrive at the decision and what prompted the change?

Initially, in the AI space, we were looking at completely different use cases. We were going to do identity management in education, and then we pivoted away from that space and started doing work for other sectors with broader positioning. We started looking at computer vision much more broadly – not just face recognition, which is what we started with – and I had a moment where I realized that it has tremendous application areas.

Technologies like AI and computer vision have vast, multiple application areas. That’s when I started meeting people from across sectors and domains, to talk to them and understand what is happening in their spaces. I met people from the automobile industry, manufacturing, textiles, and then also in healthcare – and healthcare struck a chord with me.

There were much more lucrative and easier routes in other segments, but what struck me was that we could democratize technologies like this and make them available to a large population to solve big problems in an economically viable manner. I felt passionate about that.

And so we zeroed in on cervical cancer as the first condition to take on, while realizing that this would probably be our first beachhead and that the core technology could be leveraged to tackle other conditions going forward.


What were the initial challenges you encountered when you began building a solution for the clinical pathology space?

I come from a technology background, and we started by building only the AI software as a proof of concept, trying it out on small datasets. But there were absolutely no enabling systems available which would help the AI…

You mean integration to facilitate the transfer of data from slides to the system?

Yes, but there weren’t even any systems to capture the data in the first place. You need to capture it, convert it into a digital format and then apply the AI algorithms. So we built the AI algorithms and were quite gung-ho about it, and then when we started talking to customers we realized that we hadn’t done a good job of assessing the limitations.

We wanted to provide the ability to detect critical cancer conditions not only at tertiary care oncology centers in cities like Bangalore, Mumbai, Chennai and Delhi, but also at primary physicians’ clinics in faraway Tumkur, Bijapur and Ballari – tier-three towns.

For that to happen, we had to build the enabling systems too and backward-integrate them to create an end-to-end ecosystem. That meant building hardware which just wasn’t commonly available. The bitter truth hit home that India still imports 75-80 percent of its medical devices, and most aren’t built for point-of-care but for large centralized centers such as a Vedanta or a Fortis. So we had to design everything from the ground up to take the bull by the horns in this space and truly create a point-of-care system.


To build the ecosystem of the product, so to speak…

Yes. What started as one bespoke application morphed into a solution with multiple components all working in tandem as a single end-to-end product. That meant we had to deal with designing hardware, getting it manufactured and into the production queue – a whole different ball game from what we had expected.

Tell me a little about the deep learning systems you’ve implemented.

When we started off, deep learning was quite nascent in terms of maturity, so we began with traditional machine learning techniques. We then started experimenting with deep learning because it gave us a quantum leap over some of the traditional techniques. But even today, it’s not one-size-fits-all. We’ve really had to tinker, experiment and build a suite of algorithms to make it work. Today, we use a mix of traditional machine learning and newer techniques like deep neural networks to do the processing, each of which brings its own strengths.

We also have to deal with the hard constraint of being in the medical world, where data is a real luxury. It’s not like we’re dealing with consumer data where you can crawl the web and scrape endless amounts of information to feed into your system.

So we’ve used some standard frameworks like TensorFlow and Caffe. We’ve also had to use transfer learning because of the insufficiency of data as I mentioned earlier, and build customized networks to deal with it.
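For readers who want a concrete picture of the transfer-learning approach Adarsh describes, here is a minimal sketch in TensorFlow/Keras of fine-tuning an ImageNet-pretrained backbone on a small set of labelled slide-image patches. The backbone choice (MobileNetV2), the binary normal/abnormal head and the "patches" directory are illustrative assumptions, not details of Aindra’s actual pipeline.

```python
# Minimal transfer-learning sketch (illustrative, not Aindra's actual pipeline).
# Assumes a small labelled dataset of slide-image patches in ./patches/{normal,abnormal}.
import tensorflow as tf

IMG_SIZE = (224, 224)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "patches", image_size=IMG_SIZE, batch_size=32)

# Start from an ImageNet-pretrained backbone and freeze it,
# so the few labelled medical images only train the small classification head.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
backbone.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs. abnormal triage
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```

Freezing the backbone is what makes a small medical dataset workable: only the final layers are learned from scratch, while the general-purpose visual features come from the pretrained weights.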


On the clinical side, how did you go about partnerships and sourcing the data?

We partnered with a very reputed tertiary oncology center called Kidwai, one of India’s best, and went through an extremely long cycle of approvals before we got the final go-ahead. We then worked with their pathologists and used data from their archives. So we basically co-opted them into co-creating the product and the algorithm: they imparted their clinical expertise and we translated it into an effective AI. It was a long journey, beginning with extracting medical data, drawing useful information from it and so on. Digitization of the data was a big and crucial part, i.e. turning archived glass slides into a usable digital format for training our system. We are now at the stage of clinically validating many of our solutions, which will hopefully be done by mid-2019.

And I’m guessing all of this data ultimately feeds into your computational pathology platform, Astra. What can you tell us about it?

What the computational pathology platform allows us to do is extend our application area to tackle other conditions by retraining newer models with newer data on the same underlying framework. We’ve been looking at other cancers such as blood cancers and leukemias, and hope to move from cytology to histopathology – the study of tissues – which means we would be going after prostate cancers and the like.


I was just about to ask you if Astra could potentially tackle detection of organ malignancies and other forms of cancers…

Yeah, I hope that answers your question somewhat. We’ve taken a lot of pain and effort to get to where we are today. And in hindsight it was worth it, because it has prepared us to replicate the work more easily, since our learning models can be fed with newer data to provide effective results. As an experiment, we’ve been able to prove this to ourselves by using a completely different dataset that deals with breast cancer tissue, and we achieved results as good as with our core solution for cervical cancer. Which means our network is now generalizing well enough.

What are the current challenges in cervical cancer detection? You mention affordability as one on your site, but in my conversations with a few pathologists I was told that it is not, per se, a major problem.

There are multiple levels of efficiency and it’s not just one aspect that counts. When we talk about the cost of a test, it’s not just the price of a pap smear test that a woman has to deal with.

Imagine you had to get some kind of test done, but it is only available in Mumbai and costs 100 rupees, while you’re in Bangalore. The cost now increases exponentially because you have to spend money on travel, take a few days off from work and so on. For women in India, it could mean having to leave children and family. And if you’re a daily wage earner, it would mean losing your paycheck for a few days. So it all adds up. If the test were available closer to where you live or work, it would make a major difference. Accessibility to proper mechanisms drives affordability.

Let me give you an anecdotal reference: I have seen with my own eyes a pap smear test report (along with the bill) costing 2,000 rupees at a panchayat away from Tumkur. The same test in Bangalore would cost you 500-1,000 rupees. If you’re a below-poverty-line patient and you go to a center like Kidwai, it would probably cost you 200 rupees.

So you can see how the problem becomes acute in an inverted way. Ultimately, the pathologist who actually does the reporting is probably going to be paid the same amount. But the center which picks up the sample has to handle the logistics of transporting it to Bangalore; there is labor involved at the clinic, margins involved, and so on. The cost ultimately gets padded up.


So that’s the economics of the problem. What are the other challenges?

Awareness is a major challenge, and it is compounded. How many of us actually go for regular health check-ups once a year? Very few. We don’t go to a doctor unless we’re sure we’re sick or feel a lot of pain. Or your employer has a corporate healthcare plan, so you get preventive health check-ups. Or perhaps your insurance provider compels you to get them done. These are some of the triggers. Now imagine you didn’t have these triggers and you had to travel the distance of Bangalore to Bombay for some basic diagnostic tests. That would only worsen your willingness.

Factors such as these compound the problem and have led to India becoming the cervical cancer capital of the world. One woman dies every seven minutes due to cervical cancer in India. These are official numbers. And we’ve been talking for longer than that, so you can do the math.

Remember, too, that many people who die in India never actually get the right reasons ascribed to their deaths at all.

Your solution then also provides advantages in terms of turnaround time for cervical cancer detection, obviously…

Currently, if you’re in certain tier-II cities or the towns that border them, the turnaround time is around 4-6 weeks. The analysis itself doesn’t take that long, but the sample has to be picked up and transported to, say, a Kidwai or a St. John’s, which are already dealing with their own internal requirements, and then it gets entered into their backlog.

With a point-of-care system like ours, the patient comes in, provides the sample, and within an hour and a half – during which she could get other tests done – she will have her report ready. This fundamentally disrupts a lot of things.


One of your stated aims is to democratize access to healthcare and this certainly seems to fit the bill…

Yeah. Pap smear testing is not something we have invented. It’s a proven technique that is well tested and has existed for decades. With proper testing and proper usage, cervical cancer rates can come down dramatically, as evidenced by the incidence and mortality rates in the US and UK, where yearly or at least bi-yearly screening is done on a very regular basis.

In India, with the population, the lack of accessibility, and a heavily skewed physician-to-patient ratio, you’re never going to be able to fix it unless technology acts as a force multiplier. And I’m not just talking about this particular condition – any condition, for that matter.

The US has 23 doctors for every ten thousand people and the UK has about 25. India has 7. So this is a systemic constraint.

And you certainly cannot manufacture clinicians overnight either…

Absolutely. Let’s take cervical cancer, for example, which in India has around 350 million women in the risk category – more or less the size of the population of the United States. How can we possibly change the status quo at that kind of scale if we keep doing the same old things expecting new outcomes?

The primary health centers you’re keen to provide your solution to have their own challenges, such as lack of electricity, non-availability of round-the-clock internet and so on. Have you factored those in?

Absolutely. Which is why we don’t have the AI algorithms on the cloud. We haven’t built an exotic or esoteric application on the assumption that there will be round-the-clock connectivity or electricity.

This is an edge computing device, where the first level of triaging – whether a sample is normal or abnormal – takes place in real time. For confirmation, we bring in a pathologist for a report on a non-real-time basis. Only once the sample has been analyzed by a pathologist is the confirmation provided.

What we’re doing is decoupling the need to have a pathologist available all the time for analysis and reporting. From a medico-legal perspective, we still have a pathologist sign off.


It’s timely you say that. Because one of the questions I had was whether your solutions are built into the hardware, cloud-based or some kind of a hybrid model…

Yeah. Putting it on an edge device is extremely hard because you’re talking about algorithms which require serious computing power. The easy way would have been to put the solution on the cloud, where it would only work in urban centers with good connectivity, but that would not serve the purpose at all. So we really had to re-look at every bit of the entire value chain to decide what had to be done.
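As an aside, here is one common way such an edge deployment can be approached: converting a trained model into a compact, quantized format that runs offline on modest hardware. TensorFlow Lite, the quantization step and the file paths below are assumptions for the sake of illustration; the interview confirms only that Aindra runs its algorithms on an edge device, not this specific toolchain.

```python
# Illustrative sketch of shrinking a trained triage model for offline, on-device use.
# TensorFlow Lite and post-training quantization are assumptions for this example;
# "cervastra_triage.h5" is a hypothetical path to a trained Keras model.
import tensorflow as tf

model = tf.keras.models.load_model("cervastra_triage.h5")

# Convert to a compact TFLite model with post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("cervastra_triage.tflite", "wb") as f:
    f.write(converter.convert())

# On the device, inference runs entirely locally, with no cloud round-trip.
interpreter = tf.lite.Interpreter(model_path="cervastra_triage.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def triage_score(patch):
    """Return the model's abnormality score for one preprocessed image patch (NumPy array)."""
    interpreter.set_tensor(inp["index"], patch[None, ...].astype("float32"))
    interpreter.invoke()
    return float(interpreter.get_tensor(out["index"])[0, 0])
```

The design choice being illustrated is simply that the model, once trained, is packaged to run where the sample is collected rather than in a data center.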

Are primary health centers the only place where this could be disruptive?

We’re not just talking about primary care centers; we’re also looking at gynecology clinics that don’t do cervical cancer testing because they don’t have the equipment or the resources. They will just refer the patient, if at all, to a pathology lab, which itself has a host of problems, as I mentioned earlier.

So, imagine a patient walks into a gynecology clinic and the gynecologist has two compact devices on her desktop. The gynecologist collects the sample, does the examination, and in a short while the system confirms whether the patient needs further scrutiny or can be sent home. It could be that simple.

What are the challenges you’ve faced in implementing the AI on your custom hardware?

We had to keep in mind the kind of computing power necessary and figure out which GPUs we need and whether they’re available in the form factor we desire. Our goal was to put the model onto a smaller-footprint device. We also had to optimize how, and how fast, the data is processed given the available computing power. And so on. A lot of thought went into it.


Pathologists I spoke with told me that some of the legacy machines throw up red flags in case of ambiguity in data interpretation, which meant that they would have to manually confirm some of the findings. What kind of manual fallbacks have you incorporated?

As I said earlier, at the first level we produce a triage result – normal or abnormal – and we indicate areas of abnormality. We present the findings to our panel of pathologists via our telepathology platform, and once they sign off, the report is sent back. As the pathologists analyze, the system’s findings are presented to them, calling out the areas where it has detected abnormalities and speeding up the process.
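To visualize the workflow Adarsh describes, here is a rough sketch of how a triage result with flagged regions might be queued for telepathology sign-off. All the field names, the threshold and the queue itself are hypothetical, intended only to illustrate the decoupling of on-device triage from the pathologist’s confirmation.

```python
# Rough sketch of a triage result queued for telepathology sign-off.
# All field names and the review queue are illustrative assumptions,
# not Aindra's actual data model.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TriageResult:
    sample_id: str
    label: str                      # "normal" or "abnormal" from on-device triage
    abnormality_score: float        # model confidence for the abnormal class
    flagged_regions: List[Tuple[int, int, int, int]] = field(default_factory=list)
    # (x, y, width, height) boxes over the digitized slide image
    pathologist_verdict: str = ""   # filled in after remote review and sign-off

def queue_for_review(result: TriageResult, review_queue: list) -> None:
    """Send abnormal or low-confidence samples to the pathologist panel."""
    if result.label == "abnormal" or result.abnormality_score > 0.3:  # hypothetical threshold
        review_queue.append(result)

# Example: an abnormal sample with two highlighted regions awaiting sign-off.
queue: list = []
queue_for_review(
    TriageResult("S-001", "abnormal", 0.91, [(120, 340, 64, 64), (800, 210, 64, 64)]),
    queue,
)
```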

I was told that there might be situations where further information is needed from the primary clinician…

Clinical information – such as the woman’s age, whether she is married, how many kids she has, whether she has had multiple partners, whether she is menstruating, and so on – is already collected without us being in the picture. This is done by the primary care physician, and this data, too, is presented to the pathologist if the system finds any abnormalities. And they can choose to override the system if need be.

A number of industries have been transformed by the revolution in low-cost sensors. Have you benefited from it?

Absolutely. A lot of what we are able to do is because of the convergence of these factors, which have driven down the cost of computing dramatically. If that weren’t the case, we wouldn’t have been able to make use of our technology. The same goes for sensors, which are crucial to our solution. Advances like these have played a large part in democratizing technology, which is why I believe we’re at a great point in time. If only India had a semiconductor industry, we could have achieved enormously more. But it’s an imperfect world.

Broadly, what is the response to AI Diagnostics in India?

There’s a lot of apprehension as well as interest right now, depending on who you talk to. You’ll largely find a bit of both. Some people feel threatened by the advent of the technology, and there are a few who are able to look at the trends that are unfolding and are much more accepting of the changes. There is certainly a high level of awareness about the coming of AI.

What is the biggest challenge in your view in diagnostics/medical AI?

From my experience of having spoken to various researchers, academicians and clinicians, the biggest challenge globally is the integrity of data. That is super critical for AI to become a widespread reality. Since we’re all going to be relying on large amounts of data to make our models as effective as possible, we have to ensure high integrity of data. This means ensuring minimal bias, because biased data in medical AI is turning out to be a massive problem. Sadly, many AI companies aren’t even aware of how important this is – it could determine the success or failure of their systems.

(Special thanks to Dr. Neha Ratan B, M.D. Pathology for her inputs.)
