The 2019 RSNA Machine Learning showcase promises to be even busier than last year. The number of exhibitors has grown to over 70, and last year's companies have not been kidding around: they have some interesting things to show.
So how can you get the most out of the little time you have to walk around the RSNA Machine Learning showcase? At Quantib, we came up with some out-of-the-box questions you could ask exhibiting AI companies to make the most of your visit!
RSNA Exhibitor Questions
1. What was your input data for training?
Algorithms do not appear out of thin air. You need input data to develop an algorithm, and what goes in, comes out: the quality of the algorithm depends strongly on the data used for training. Therefore, it is very important to understand the type of data, the amount of data, the data labeling, etc., used to build the algorithm. It may seem obvious: the larger the dataset, the more resilient the algorithm should be. In practice, however, there is more to take into account, with variation in the dataset being key. For instance, were input scans obtained using different scanners (i.e. different vendors and, for MRI, varying field strengths)? Were all scanning protocols equal? What is the range of accepted TE, TI, and TR in the MRI scans? And last, but not least, what was the ethnicity of the patients? Insights acquired from, for example, a Caucasian or Asian database do not necessarily apply to other populations.
2. What options do you offer to integrate your software with the radiology workflow?
Radiologists have to deal with a wide range of applications daily - PACS, PACS viewer, advanced visualization, speech recognition, EMR - and the list just goes on. And it is no secret that radiologists are not very keen on adding to that list. Hence, workflow integration is one of the most important distinguishing factors for AI startups in the field of radiology. Asking Machine Learning showcase exhibitors what kind of integration they have in place, or are planning to implement, can trigger an interesting discussion between you and the company. You can see what they can do already, and they can see how best to meet your wishes. There is still a lot to be learned on both sides of the table!
3. How does your software help me do my job?
Aren’t all AI start-ups showcasing at RSNA because they want to support radiologists in doing their job? Check whether they have really thought about how to do that by asking this question. How do they make your life easier? Will the software automate something you already do? For example, can it count tissue changes and report progression? Will it provide you with more detailed information helping you quantify what you see, e.g. through volumetry? Can you easily compare follow-up with baseline scans? Does it provide standardized reporting, ensuring comparable result documentation between exams and between patients? Or does it provide a certain standard, enabling radiologists to compare patient results to a reference?
4. Have you measured reproducibility? If so, how?
Many factors, such as scanner type, scanning protocol, patient placement, etc., can influence software performance. Even repeating the exact same scanning protocol on the same scanner with the same patient a second time will yield a slightly different image, and hence different algorithm results. Therefore, it is of the utmost importance to test the extent of the influence these factors have on algorithm performance. Asking a company about their repeatability testing will give you an insight into how reliable and reproducible their algorithm results are.
5. Is your software FDA approved or CE marked?
This one cannot come as a surprise, and might not seem like an exciting question to ask, but it is nonetheless a relevant one. FDA approval and/or CE marking makes the software available on the market and ready to use in your radiology department. Before you get very excited about a radiology software product, ask this question to make sure you can actually put it to work!