Advancing FHIR in Clinical Registries and Research Video
Video Transcription
Great. Welcome, everybody, to our second webinar in our CMSS Registry Science and Research Webinar Series. I want to thank the CMSS staff as well as Heidi Basley for pulling this together. Our first webinar, on July 25th, was an overview of USCDI and USCDI+. This webinar will take a deeper dive specifically into FHIR and how FHIR can be used to advance both clinical registries and research. This is a question we hear a lot from our members: what's happening in this space, and what are the implications for the work they're doing in terms of their clinical registries as well as their clinical research initiatives? So we've got just an incredible lineup for you today. I think you'll hear the promise of FHIR, the potential for FHIR, and then some very real-life insights from our colleagues at the American College of Radiology. So let me just introduce everyone to start with. Our first speaker will be Aneesh Chopra, who is the president of CareJourney, but many of us know him from his role as the first chief technology officer of the United States. He is very much a standards zealot and someone who I think is trying to show us the way forward, so we're excited to have Aneesh join us today. He'll be followed by Marty Velaziz, who works with FDA and their Data Standards and Interoperability Group, and she's the chair of their Coordinated Registry Network Architecture Task Force for MDEpiNet. And then last but not least, we've got two folks joining us from the American College of Radiology today: Chris Trebl, who is the director of their Data Standards Institute, and Brian Bielecki, who is director of IT Standards and Interoperability.
So we're excited to give you an overview of what FHIR is and what its potential is, and really help us better understand what we as specialty societies can be doing to better leverage FHIR, and what roles it can play in advancing the research work and evidence generation that we're also invested in as societies. So with that, I'm going to turn it over to Aneesh, and Daniel will start his slides. Thank you all. Thank you so much, Helen. I greatly appreciate it. And as luck would have it, just this morning ONC posted an update on both the promise, meaning a reminder that the Cures Act regulations are going live this calendar year, and a call to action that encouraged organizations to call their EHR vendor to ask for instructions on how to access and use this new technology, and then a little bit of a sobering update, which is that only about 5% of the certified EHRs as of today have updated all of their regulated Cures Act capability. But on the positive end, those 5% include Epic and Meditech and a few others that have sizable market share. So something like two-thirds of the market share on the hospital system side have the capability we're going to be talking about today. So with that, I know it's described as a FHIR discussion, which feels very technical, but I want to come at it through policy to understand the context, because it may help explain a disconnect between what you think is realistic for you to take advantage of today versus what is potentially available for you in the future. I'm on the "value is in the near term" side, and if you're skeptical, this may give it some context. So why don't we dive right in, if you don't mind, to the next slide. I will remind us a little bit of the context here. In February of 2009, we had just seen the passage of the Recovery Act. I was still on the transition team; I had not yet been named U.S. CTO.
And alongside my good friends Farzad Mostashari, Todd Park, Danny Weitzner, who eventually became my deputy, and Dr. Peter Basch, who's the only one of us who stayed at MedStar, meaning didn't come into the administration, we crafted a framework for how to think about implementation of the HITECH Act so that it would advance the president's objectives. We had five very clear goals that we put forward, and I want to reference those now, because I think it'll help you understand where the FHIR API standards movement fits in context. First and foremost, we acknowledged the historical reality that the lack of health IT adoption had largely been a problem of payment reform, which of course is the fuel for the economic case for changing the way we deliver care, which then would drive demand for more features and functions. So if you think about health care as a productivity function: if you're optimizing for fee-for-service health care, you're not going to get the same demand signal for interoperability as you would if there were an economic model for clinical integration. As a result, we've had a bit of a feet-in-two-canoes problem, where the demand signal on IT has been moving towards optimizing fee-for-service, even though the data and technology we need for registries in particular, which are about clinical improvement, are about integration, and that's got a different economic model. So we were framing the hope of the HITECH Act in the context of clear population health goals. Absent that, we thought the program was going to be a challenge. We then wanted to drill down to say that if you wanted to get paid for outcomes, they would be tied to clinical outcomes, not "I bought a piece of software, here's my Best Buy receipt, give me my coupon." We felt that the Medicare Innovation Center would be the ultimate demand signal for what the future workflows would be.
And so we wanted a bullpen that linked CMMI and the National Coordinator. We felt that we needed a kind of Geek Squad on demand so that doctors and hospitals, particularly in underserved communities, could get the support they needed to participate in this technology-driven delivery reform movement. And last but not least, acknowledging that there could be some XPRIZE-like challenges that we need in healthcare delivery, this could be the vehicle to source, validate, and scale the best of them. So where do the FHIR APIs fit in the context of overall healthcare delivery reform? If I were to grade it: did we really move to value-based care as the economic driver of the American healthcare system? No, we're much further behind than we would have liked, which is why we've had this convoluted demand signal. Did we do a good job linking specific incentive payments to clinical outcomes? That remains a work in progress, I think, is the charitable way I would describe it. And have we done the alignment between the future state, the Innovation Center, and the health IT world? We probably could have done better. Thankfully, ONC just announced a new HHS policy that says all departments, including CMMI, have to have FHIR-first thinking, and that they're going to double down on aligning with ONC. So that's very positive. And obviously, we have work to do on health equity and improving the digital infrastructure in underserved communities, as well as fostering new innovation. So not the best grade on where we wanted to be, but at least you have a little bit of context to explain why, a decade later, we haven't reached the promised land. If we go to the next slide, now we can shift a little bit in the policy domain: what is the role of the government in standards development? FHIR is a standard. It is developed by HL7, which is recognized as an official standards development organization.
In the United States, we have a legal framework that says the private sector sets the pace and adoption timeline for standards. The government can reference industry standards in regulation, but it can't create standards. Now, I was frustrated personally. Standards did report to me as U.S. CTO, and I asked the question: there seems to be a glaring problem. What happens if you want standardization in an industry where the private sector doesn't have a lot of incentive or capital deployed to share? You can't wait on industry standardization if there really isn't a lot of appetite for it, but we need it to be successful on our policy objectives. So I asked our commerce secretary if we could dust off the official rule book and remind ourselves: is it really the role of government to simply wait for the industry to get organized? And if you go back in history, you can see in this graphic that we had a commerce secretary by the name of Herbert Hoover. He was later a particularly underperforming president, but when he was commerce secretary, he faced a conundrum: airplane production fell off a cliff when the military no longer demanded planes after the fighting ended in World War I. The airplane manufacturing industry was looking for a bailout because demand was falling, and we still wanted to be a leader in the industry of the future. But he was philosophically opposed to a bailout. He was an engineer himself, so he asked the question: why do the planes you make today, to say it in uncharitable terms, stink? Why are they not successful on their own? It turned out there were two technical problems that the private sector couldn't solve on its own: we didn't have a good design for engine cowlings, or for airfoils. So he said, well, a good role for the government could be to fund that kind of standards development in collaboration with the private sector.
So the beginnings of NASA, before NASA was NASA, were formed, and we invested in an open standards effort. What was amazing was that the result was less the need for a bailout and more the introduction of the two most successful aircraft of the era, the DC-3 and the Boeing 247. They competed vigorously in the market, but they shared the same intellectual property for those core components. We're in a similar moment in healthcare data sharing. We've now made everyone go digital, but we want to connect the dots so we can actually use all this infrastructure to improve quality, and hopefully improve costs as well. Apple Health is sort of my poster child here, because while we do have government regulations on applications for consumers, we really haven't had a lot of real-world demand. I remember vividly, the minute Apple announced that they were creating this app for consumers, the phone was ringing off the hook with health systems that wanted to be connected and didn't want to be left behind. And CIOs got the message that we need to turn on this infrastructure. Apple, to their credit, said: I will not make the app available unless the doctor or the hospital is on the FHIR API. I give credit that ONC did a great job in getting us to the standards process, kind of like the Hoover story of public-private collaboration. But really, Apple gets the prize, because if they didn't mandate the FHIR standard, it's not clear to me that, even if people shipped it, it would actually work. And so the conclusion I have on this slide, for those of you who are not yet satisfied that the state of affairs with industry standards is sufficient for you to incorporate this into the development of your registries: you're not going to be a passive observer while someone else solves it for you. You've got to be an active participant. I call this the era of handshakes and handoffs.
There is bipartisan agreement, loud and clear, that we need to be doing this, but we need a handoff between the public sector and the private sector: the registry equivalent of an Apple Health needs to emerge that can drive towards a use case on how these technologies are meant to be used. And so we're in that moment now. Basically, I told Helen I'm eager to have this webinar because we need to find the registry equivalent of Apple Health that can demonstrate the value of the existing Cures Act FHIR API standards now that they're coming into production over the next 12 months. So that's my philosophizing. Now I'm going to spend a few minutes on how a bill becomes a law, and then the technical options. So if you go to the next slide, I want to walk you through "InstaFHIR." We had this ONC electronic health record ten-year journey to interoperability, but on the health plan side, we didn't have any requirements. Zip, zero, nada. So this is the story of compressing the timelines. In other words, what happens if you start from scratch with nothing? Can you emerge with a FHIR-based option that can go live, and in what period of time? There are two stories on this slide, but they both speak to compression of time. The CMS administrator, Seema Verma, gave a speech in March of 2018, and she basically said: I want you, the private sector, to begin developing FHIR APIs on the health plan side of the ledger, meaning the health claims. And many health plans have been collecting clinical data too. So basically, get ready. I want you to take all the data you have, the clinical data and the claims data you generate, and begin thinking about making it available through FHIR APIs. That speech was followed by a call to action in a regulatory document called the Medicare Advantage Call Letter that came out a week or two, maybe a month, later.
And I was sitting at the CARIN Alliance, which I co-chair with Governor Leavitt, and we basically saw this as a window of opportunity to rally the private sector to meet the public sector. So if you look graphically here, the blue is government action; the green is the private sector reaching consensus. So we met and issued a call to action to plans: who wishes to do what the administrator asked of them, well before formal rules are issued? So we're now in late spring 2018, a solid year or so before any official rulemaking called industry to action. The great news is that voluntary effort got enough health plans, the Humanas of the world, the Cambias of the world, whose development teams worked voluntarily on a framework for how to do it. We released that framework in six months, maybe seven, and we began real-world testing in time for CMS, when they proposed the rule a year later, to say: oh, by the way, we can't mandate that you do what's happening over on the CARIN Alliance side, but if you're interested, we strongly encourage a developer consensus approach. And therefore, six months later, we had industry agreement. And so on the White House stage, we made a big announcement that a whole bunch of plans and stakeholders agreed to work on the emerging FHIR standard for payer data, what was called the Patient Access API. And we ultimately got it balloted by February of 2020. The bottom line is this: without the private sector matching up with the public sector, we would not really have a standards option for health plans. But because we worked in tandem, by the time plans went live in July of 2021, which was the deadline for them to go live, it was obvious, without the government having to force it, that everybody would simply choose the Patient Access API to comply. And in fact, all 1,000 plans, the vast majority of whom are now live, standardized on this infrastructure, which was great.
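To make the Patient Access API pattern concrete: an app that pulls payer data gets back standard FHIR resources, typically ExplanationOfBenefit entries inside a searchset Bundle. Here is a minimal Python sketch of summarizing such a bundle. The sample resource below is hand-made for illustration and far simpler than a real CARIN-conformant ExplanationOfBenefit, and no actual endpoint is called.

```python
# Sketch of consuming a Patient Access API-style response.
# A real client would first fetch the Bundle over HTTPS with an
# OAuth token; here we work from an in-memory sample instead.

def summarize_eobs(bundle: dict) -> list[tuple[str, float]]:
    """Pull (claim id, total amount) pairs from a FHIR searchset Bundle."""
    results = []
    for entry in bundle.get("entry", []):
        res = entry.get("resource", {})
        if res.get("resourceType") != "ExplanationOfBenefit":
            continue
        # Sum the 'total' amounts carried on the claim.
        total = 0.0
        for t in res.get("total", []):
            total += t.get("amount", {}).get("value", 0.0)
        results.append((res.get("id", ""), total))
    return results

# Hand-made, minimal sample bundle (illustrative values only).
sample_bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {
            "resourceType": "ExplanationOfBenefit",
            "id": "eob-001",
            "total": [{"category": {"text": "submitted"},
                       "amount": {"value": 125.50, "currency": "USD"}}],
        }},
    ],
}

print(summarize_eobs(sample_bundle))  # [('eob-001', 125.5)]
```

The point of the sketch is that once payers standardized on one FHIR shape, a single small parser like this works against any conforming plan.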
My point is, if there are registry functions that need more information than is currently regulated in USCDI, take comfort in this slide: it can be done with a voluntary effort, without a billion-dollar subsidy to underwrite the thing. I think we did this on a shoestring. Basically, everybody came hat in hand and just made it happen. It wasn't a huge expenditure. The second example is on the bottom right, about the COVID vaccination world. There were no standards for how a consumer could validate that they had the vaccine, or hold any health information on their phone. I mean, you could have a PDF, or take a picture of the vaccine card and store it, but none of it really felt formal and official. Well, we as a community rallied around the use of FHIR APIs to share proof of vaccination. Very simple use case, very narrow in scope. And in months, we had all the major EHR vendors and stakeholders agreeing. And today, I fly United a lot. My friend Atul Butte at UCSF tweeted, hey, great work: I can put the FHIR health cards that are sitting in my Apple Health app right natively into my United travel app. So without having to shift PDFs or scan documents, I can have my proof of vaccination on my Apple phone and share it with United, all frictionless, no fees, and it works in a standardized way. So that's the backdrop. And Helen was kind enough to remind folks that I'm more of the lecture guy in the session today, which is going to be a little bit longer than the average presentation. Then my colleagues are going to follow, and we'll have a bit of a Q&A. And by the way, I'm very active on this chat function, so if you do have questions, comments, or reactions, feel free to put them in; I'd be delighted to respond. Okay. Now on to the technology. What is FHIR? Where are we going with it? How do we make it work for our organization?
Now we'll get a little bit more into the deep dive. So if you don't mind, I'll go into the first option here, which is the consumer FHIR API, if you go to the next slide. As you can tell, I'm a pro-Apple fan, but know that Apple is not the only option on the playing field. My good friends at CommonHealth, where I'm kind of a volunteer advisor, basically replicated the Apple infrastructure on the Android ecosystem as open source. So on both major phone platforms, which are ubiquitous throughout our society, there's a no-fee option to get your regulated health information onto your phone. Ironically, CommonHealth is now ahead of Apple because it went and began pulling in the claims data. Cigna is live: you can put your Cigna claims data into CommonHealth, which is a feature that's not yet launched in Apple Health. So what does it mean for the FHIR API to be a kind of open standard for clinical sharing networks? Well, as luck would have it, Apple and Android have been doing the painful work of last-mile connectivity. Apple and CommonHealth are both connected to about 800 clinical sites today and growing. It costs nothing to connect; it's really more of an education. If you're a health system and you're listening to this webinar, it's not like this automagically happens. You have to flip a switch in your Epic system or your Cerner system, and then it happens free of charge, but at least you turn on the functionality so that the content in your systems can be made available when a consumer logs in with their username and password. Apple has been very kind to publish a list of all the sites where it's live, and that's where you see the screenshot. And by the way, if you're wondering, is this only for the rich and famous? Please note the very first person on the list when I pulled the screenshot: Richard M. Adams of Family Foot Care of Texas, I believe a podiatrist.
And Helen, I don't know the average podiatrist's IT budget, but I'm guessing it's not a multi-billion-dollar item. So if Richard M. Adams can do it, I would imagine so too can you. Go to the next slide. I want to walk you through the second wave. That consumer wave went live with the 2015 edition of Meaningful Use. At that time, it didn't name FHIR, but because of Apple's leadership, it was de facto a FHIR option, because the only ones that went live basically did so with Apple using the FHIR endpoint. So we were lucky. So now we get to the Cures Act: signed by President Obama, inspired by President Biden, massively bipartisan in the House and the Senate, signed into law December of 2016. It calls for APIs to be made available in healthcare that can be connected without, quote-unquote, special effort, meaning a plug-and-play architecture. That resulted in a Cures Act regulation sponsored by ONC that said: we want the same content, the FHIR data model for the U.S. Core Data for Interoperability, or, at the time, the Common Clinical Data Set (meds, labs, problem lists). And we want to extend that from consumer access to physician access, and momentarily I'll describe what it means for population-level access. And so now the regulations are coming online. I'll have a more formal slide later, but I'll just say it verbally today. The timelines are as follows: EHR vendors have to certify and provide this functionality no later than December of this year. Doctors and hospitals, if you are trying to collect your MIPS quality bonus from CMS, have to turn this on no later than September of next year. So let me just show you a little bit about how big tech plays in healthcare. Well, they're building SMART on FHIR apps. The Apple Health app, which was launched a year ago, is for physicians to download free of charge as a tab. This is a screenshot inside Cerner. No cost to integrate, no cost to Apple, no cost to the doctor.
Of course, no cost to the patient. And it allows that doctor to see the Apple Health content in context with the medical record that they're operating in. Similarly, Microsoft has launched a Microsoft Teams connector. So if you wanted to have a virtual telemedicine consult, it can integrate the details of that consult from the EHR system into your Teams system. So you can integrate even if you have multiple documentation systems, and it looks a lot more clinically integrated than it would if it wasn't all built on one EHR. And you may have read that Google Health has launched a Google Care Studio product. While this has not been officially launched as a SMART on FHIR app, a lot of what they've been announcing lately are their SMART on FHIR capabilities, so I'm putting it here more as a reference point. It allows you to do that kind of longitudinal patient search function, which has been a bit of a hassle in your traditional EHR: think of a Google search function natively operating within the environment. Go to the next slide. I can walk you through a little bit about how we go from here to there. And obviously I'm very Pollyannaish and supportive and bullish, and I think we're going to be making great progress in healthcare, but I'd be remiss if I didn't acknowledge that a lot of the policy objectives of late have been hindered by a lack of success in interoperability. So we've had to go through the, I don't know what Gartner would call it, the disbelief trough before we get to the steady state. One example: CMS announced that it wanted all doctors in risk contracts or accountable care contracts to generate electronic quality measurement reporting out of their EHR this calendar year. And the trade association NAACOS was, I want to say vehemently opposed, but maybe politely explaining, that it's a huge technical burden.
In fact, I'll be addressing the NAACOS conference in September. The average ACO was told to spend an extra $250,000 per ACO to upgrade their IT systems to meet this electronic quality reporting: dead on arrival. So CMS had to cave, and in the final minutes of the regulatory season in November of last year said, fine, we give up, we'll delay the regulation for three years because IT isn't ready. It generally stinks that IT blocks the policy objectives of an administration, and unfortunately we find that a lot of these things have happened. The surprise billing legislation: a lot of you are grappling with how to deliver an advance estimate. You know, like when you go to your Jiffy Lube and they give you an estimate; now doctors have to produce that Jiffy Lube estimate inclusive of their fees, the hospital's fees, the anesthesiologist's fees, the post-acute care fees. All of that has to be itemized in a good faith estimate. And honestly, the industry just said, we can't do it, it's too hard. So, you can see here, the FAQs state that the complexity of this requirement makes it virtually impossible for providers to comply by January 1 of 2022; therefore the departments have had to defer enforcement of this feature until future rulemaking. Just think about this. Surprise billing has been a top political issue for five-plus years. We finally got consensus in Congress to do it two or three years ago. And it's IT that's getting in the way of us not screwing people. I mean, this stinks. I can't imagine that IT people feel good about themselves stalling a critical consumer benefit in one of the most contentious and egregious parts of our healthcare system. IT can't make it work, and that's the problem? Ugh, it frustrates me. So this is the trough of disillusionment, right? This stinks. By the way, I'm working on all these things. There's no way I'm tolerating this. This is unacceptable.
So if you go to the next slide, you can see a path forward. And the path forward ONC announced in May is to create a trusted exchange program, an HIE-like network, but built on this modern FHIR-based technology, which is different from the clearinghouse technology that has been in place in healthcare for a couple of decades. What I want you to be focused on is that if you were looking to be part of a team to solve this, to be the Apple Health of registries, what you'd want to do is participate in, if not even propose, a TEFCA FHIR pilot. Helen, I think a no-brainer option would be to have, by October of this year, one or more of your fellow registry partners submit a pilot proposal and have ONC announce it; they'll be announcing them by the end of the year, so that you'll have ONC support tied to your use case. Unlike the traditional HIE clearinghouse model, this one is more akin to a traditional point-to-point interface. So the registry, quote-unquote, app can connect to each of the doctors' and hospitals' EHR systems. And the role of the network would simply be to tune it and operationalize it, not spaghetti-diagram-connect all the IT. That's where all the costs are in the traditional clearinghouse model: paying for all these wires. In this case, you're just managing the authorization of a set of apps. So to go to the next slide, I'll walk you through what that looks like. I think we're going to see an era of networked FHIR stakeholders. And these stakeholders could be one of three flavors. One, HIE+. Some of the HIEs that you interact with may be willing to add a FHIR product line to their SKU. And that's good because they have governance, and it's better because they could siphon off the legacy technology, put a box around the new technology, and simply add it. So you get the economic advantages without wasting all the infrastructure we spent all this time building. I think more likely will be a bunch of new entrants.
And you can see a lot of well-capitalized companies have emerged that want to build network functionality supporting the FHIR standard natively. And I don't know if any of them have talked about connecting registries yet, but it's such an obvious thing for someone to do. And then last but not least, the first two involve internet technology, meaning you're talking to people across organizational boundaries. I also see an opportunity for the same network functionality to be run intranet. So if you wanted to build something within your enterprise but want to reduce some of the burden, I mean, I can't even tell you how many FTEs at hospitals I know are manually re-entering data into registries; that whole world goes away. And you could actually facilitate a kind of in-hospital, burden-reducing FHIR network that pre-populates all the content with USCDI and then works together to fill in the blanks where needed. All of this fits in the backdrop of the regulatory timeline. If you go to the next slide, this is that timeline, a quick reminder. And again, all these slides, I assume, Helen, you'll have as a takeaway PDF that you can share later; this is all open for anyone to reuse, so don't feel like you have to take notes on your computer. But the punchline is as follows. The window is already open for applications for TEFCA; that TEFCA FHIR pilot can take applications all the way through the fall, for announcements by the end of the year. We do see one more stage of interoperability coming in a few months, October 6th. That's when consumers will have the right to access their entire electronic health information, not just what's in the kind of Apple Health menu or USCDI. That EHI open kimono will put more pressure on hospital systems to release more data in more formats through a FHIR API. And it might be that the FHIR API is distributing PDF documents, by the way.
That may be how we close the gap on things that have not been mapped. But that goes live in October. As I mentioned, all the EHR vendors not only have to certify their technology by the end of December, they have to physically ship it, provide it to the doctors and hospitals that pay them. So if you are in a health system today, you should check your watch and ask yourself: when is my EHR vendor shipping me the Cures Act functionality, and what training are they making available, so I know what the heck to do with the thing when it comes to my doorstep? And by the way, I might want you to proactively email them to find those things out, because you may not be getting that information from them. That was in fact the headline that ONC put forward in its blog post this weekend, basically saying: ask your EHR vendor about Cures Act timelines, because it is not obvious to me this has been widely advertised. The Cures Act functionality must be deployed by September of next year for financial purposes. If you're a hospital, that's in the final rule; it's been proposed in the physician rule expected to be made final by the end of the year. A year later, we'll get to the Cures Act API, where that same content that goes live October 6th has to be made available in a more technical, machine-readable format. So between October 6th and December 31st, the EHR vendors have to find a way to get that blob of content out the door. Again, no obligation to do it in FHIR unless we work together to demand it, and I'm working on that too. Meanwhile, price transparency has been live. That shoppable services rule, how much is it going to cost for that episode of care, goes live for 500 shoppable services this January and will be deployed for all services a year from then, January 2024. Just a few more slides, Helen, I promise, and then I'll hand it off.
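For the population-level access mentioned earlier, the relevant standard is the FHIR Bulk Data Access specification, which defines an asynchronous $export operation. Below is a minimal Python sketch of what the kick-off request a registry client would issue looks like. The base URL and group ID are hypothetical placeholders, and no request is actually sent; the sketch only shows the shape of the call.

```python
# Sketch of a FHIR Bulk Data "kick-off" request for a registry cohort.
# BASE_URL and GROUP_ID are made-up placeholders, not real endpoints.

BASE_URL = "https://ehr.example.org/fhir"   # hypothetical FHIR server
GROUP_ID = "registry-cohort-1"              # hypothetical patient group

def bulk_export_request(base: str, group_id: str, types: list[str]) -> tuple[str, dict]:
    """Return the URL and headers for a Group-level $export kick-off."""
    url = f"{base}/Group/{group_id}/$export?_type={','.join(types)}"
    headers = {
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",  # the spec requires async handling
    }
    return url, headers

url, headers = bulk_export_request(BASE_URL, GROUP_ID, ["Patient", "Observation"])
print(url)
```

In the real flow, the server answers 202 Accepted with a status URL that the client polls until the export completes, then downloads newline-delimited JSON files, which is why the Prefer: respond-async header is mandatory.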
If you go to the next one, I can maybe give you a little more context for what the demand signal looks like for this functionality. I assume that those of you who are listening know some of this but may not have taken much action on it. What are the areas that other departments of government are paying attention to? Obviously, as you know, President Biden on day one issued an equity executive order, and health equity was high on that list. There's an equity data working group, and in the summer of last year, they published a pretty sizable menu of things that need to be done to collect race, ethnicity, social needs, all of it, to help us do subsegmentation reporting. So to give you a little more of a window across all the agencies: I see that population-level API not only as fuel for the clinical registries, but it could also be used for the Cancer Moonshot. As I said earlier, for the quality measurement systems, the CDC, the Medicare Innovation Center, they're all focused on social determinants of health. I think the FHIR Questionnaire is going to be how we collect areas like patient-reported outcomes, social needs screening, or suicide prevention, PHQ-9 mental health screenings; any screening instrument that's not currently regulated can possibly be an extension of that using FHIR Questionnaire. SMART on FHIR apps for prior auth: I think we're going to see a new rule requiring health plans to make available to doctors when a particular patient needs a prior authorization, and what content the doctor has to present in order to get to an adjudication. Look for those rules to come out later this fall. I mentioned the EHI export; it's probably going to be helpful for disability determination or the NIH precision medicine All of Us program. And then last but not least, I think the CDC is going to move towards some kind of publishing model where hospitals can simply publish case report files to the CDC.
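To ground the FHIR Questionnaire idea: a screening instrument like the PHQ-9 can be administered as a Questionnaire, and the answers come back as a QuestionnaireResponse resource that a registry can score. Here is a deliberately simplified Python sketch; the questionnaire reference and item structure are illustrative only, and a real PHQ-9 response would carry LOINC-coded answer options rather than bare integers.

```python
# Sketch of scoring a screening instrument delivered as a FHIR
# QuestionnaireResponse. The resource below is hand-made and much
# simpler than a real coded PHQ-9 response.

def score_questionnaire(qr: dict) -> int:
    """Sum the integer answers across all items in a QuestionnaireResponse."""
    total = 0
    for item in qr.get("item", []):
        for answer in item.get("answer", []):
            total += answer.get("valueInteger", 0)
    return total

phq9_response = {
    "resourceType": "QuestionnaireResponse",
    "questionnaire": "Questionnaire/phq-9",  # hypothetical reference
    "status": "completed",
    "item": [
        {"linkId": "q1", "answer": [{"valueInteger": 2}]},
        {"linkId": "q2", "answer": [{"valueInteger": 1}]},
        {"linkId": "q3", "answer": [{"valueInteger": 3}]},
    ],
}

print(score_questionnaire(phq9_response))  # 6
```

The appeal for registries is that any instrument, whether a PRO measure, a social needs screener, or a PHQ-9, flows through this one resource shape, so one ingestion pipeline covers them all.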
So pay attention: this FHIR API discussion we're having for your benefit will actually yield a lot of benefits across federal agencies. And I think last but certainly not least, I would be remiss if I didn't acknowledge that we do have a non-technical challenge that has to be addressed simultaneously. If you go to the next slide, final slide, and I'll hit my time right on the mark here. We've got to make sure that transformation runs at the pace of trust. So for those of us struggling with opening up all this information, you remember the Google Ascension headline, quote, Google's totally creepy, totally legal health data harvesting. That may not have been a very fair headline, but it was the headline. So Reese Robinson and I coauthored a piece that called for a digital Hippocratic oath, explicitly trying to help us in the industry constrain, before government regulators mandated it, some code of conduct on how all this open data is gonna be used. And obviously with the emergence of AI and clinical decision support, I think we should borrow heavily from the Defense Innovation Board on how to embed ethical guidelines when we procure or build AI decision algorithms on all this data. So Helen, that's a tour of the FHIR API ecosystem, really grounded in policy more so than tech, because that's where we are. Back over to you. Incredibly helpful. And as you'll see, there are a couple of very specific questions for you that we'll come back to about the timing of FHIR API, the requirements for bulk FHIR, et cetera, but just for the sake of time, let's move on, and Marty, I'll turn to you. Okay. Do I need to speed up at all or? Oh, we have till 2:30. So I think we're okay. Okay. Yeah, I think we're fine. Yep. So are my slides coming? All right. I am going to be talking about the Coordinated Registry Networks and some of the work that we've been doing with the FDA and the MD-EPINET program, which is a public-private partnership. 
You can go to the next slide. And we're going to talk specifically about leveraging real-world data from existing sources. Go to the next slide. Should have been "world." All right. So we know that we have a large amount of data, right? Data's everywhere. We can't get ahold of it. And introducing Coordinated Registry Networks is supposed to alleviate that issue that Anish was talking about, which is collecting the data again and again and again. So how can we leverage the existing data from where it's collected, relying on the interoperability data standards to ensure that the data that we get is correct and reliable for our uses? So that whole concept of reusing the data coming from where it is existing is what the Coordinated Registry Networks are all about. We have 15 different subspecialties, if you will, across MD-EPINET working towards how do we get data from where it is originating and how do we bring it into the research realm of the world, right? So we need systems to enable that, systems that are interoperable, ones that are going to mitigate any information overload for the clinicians and the providers, as well as to reduce the amount of re-entry and data errors that we see going on. We also want to focus more on the patient outcomes side of things. And so we'll look at how we're doing that within the Coordinated Registry Networks. And then being able to provide that information to patients and healthcare providers, researchers and regulators for their particular downstream uses. Obviously from an FDA perspective, we definitely want to use that and bring that into regulatory decision-making. So next slide. The Coordinated Registry Networks are all about being part of the solution and modernizing the infrastructure. So this is just an example of how do we actually manage that across all of the Coordinated Registry Networks. 
Again, we have about 15 subspecialties and each one of them is working towards creating the environment in which data is coming in from all those data sources. So this is the maturity framework on which our Coordinated Registry Networks are evaluated. And the important things here are on the left side; our maturity criteria are really what they're being gauged against. So the top intentionally is our unique device identification, being the most important as to tracking the actual devices that are used or implanted in patients within these Coordinated Registry Networks that are mainly based on medical devices, and then engaging patients and incorporating patient-generated data. So what are we doing in the registry to incorporate the patient voices? Data quality. I think data quality and our data collection efficiency really go together: how well are we getting the data from the sources, does it answer the questions that we're looking for, and how many of our sites are actually collecting it consistently in the same way? So it really gets back to how much we're using the data standards. Governance, sustainability, healthcare quality improvement, this all kind of ties back to our agendas around how to leverage the data again and again for multiple uses and reporting requirements. And then the last one is total product lifecycle. Can we actually track from beginning to end how these devices are going through their product life cycle, from obviously research and development all the way through to post-market and patient outcome surveillance, right? We wanna make sure that they're safe and effective. We take all of these and we evaluate across each one of our CRNs to identify where they are in the levels. So you can see in the center column where you have levels one through five. 
So an example of that is, if we took data collection efficiency at level five, the criteria for that are that technologies are in place, that structured data capture is coming from the EHR or mobile apps for all of our core minimum data elements, and that we have a full adoption and integration of the data. So that's kind of at the top level; across the 15 CRNs, we obviously have different maturity levels within each one of them based on where they're at. But on the next slide, we'll go ahead and look at three examples, I believe. Next slide. I'm gonna skip forward, yeah. So where we really are focusing is on our unique device identification and device usage. So we know that devices exist in the world in which you work, and at a magnitude where the data being generated by those devices or with those devices is exponential, but you only need specific data to do your actual job. So how do we actually accurately identify the devices that are being used from beginning to end, right? At some point they may or may not introduce safety issues, but when they do, we want to be able to go back and look and identify them. We also wanna know them from a patient outcome perspective. How well are the patients performing with a particular device? And what are all of the data that have been generated around the use of that device or implant of that device that we need to track? And then, again, what you might be concerned about is supply management and cost containment. There's a lot of things you can do if we can accurately identify the devices that are in use, really looking at some of those other uses of the data that would really give you some power in analytics. And then devices, we know where devices are going and the technology and all of the work that's being done around digital health and communicating devices. I think in the chat, or in the questions, there were some questions about what we're doing with glucose monitoring. 
I'll pop that up just really briefly, but again, these devices are communicating more and more data, and we only need to communicate in real time what needs to be available for the clinician. But there's a lot of other metadata that can be collected and be used for evaluation later. So, yeah, how do we collect and transmit and store all of that data? And what are we doing to enable that from a standards perspective? There are a lot of things that are coming out in our use cases that go above and beyond what is being collected today in the clinical environment. And we do get some pushback, both from clinicians and implementers, on why that data is necessary or why it needs to be transmitted. But we'll look at some of those use cases in the next slide. So for what we're doing with FHIR in MD EpiNet CRNs and how we're leveraging it: our VISION, our vascular implant surveillance and outcomes network, has done quite a bit of work on integrating and collecting the unique device identification for vascular implants and being able to go back and look at that in their registries. They're also looking to improve post-op outcomes and follow-ups around their imaging studies. And how are we going to leverage FHIR to do that? And I kind of put that in here because your next speaker will talk a little bit more about that space. And we're doing a lot of groundwork within FHIR just at the resource level to make sure that the data that we want to collect is available in FHIR as a commonly collected piece of data, because if it's not, then everybody might do it differently. And so, although FHIR is very extensible, you'll hear that word, and you can pretty much do anything you need from a use case perspective, if we want some commonality across all of our ecosystems and our clinical organizations, we've got to start with the base core FHIR as well and make sure that it is meeting the needs and standardizing those things that are common to all of us. 
And Anish had referenced some of those coming from ONC and USCDI, but there are quite a few other ones that are coming about from the FDA, CMS and CDC around communicating devices and device settings and device metrics that are not there today. And what are we doing to enable that within FHIR? Our women's health technology group has a working FHIR implementation guide which outlines the common core data that's required for various different registries within women's health technologies. They developed a SMART on FHIR app that demonstrated how the IG would work. It goes back to maybe some of the data collection pieces, where it does work off the structured data capture that Anish was talking about in those questionnaires, because they're standardized. And so they will just bring in the data that we know is consistently collected across those clinical domains. And then lastly, we're working on venous access, specifically around venous access insertions, and really trying to find the sweet spot of how much data is too much data about the actual catheter placement and some of the outcomes around risk of infections and what data we can collect for that. The important thing about that is there are two things that are really forthcoming to be standardized within FHIR, which are really the anatomical location, documenting that with a little bit more accuracy, and then how much data we need from device identification that we can either get from the FDA's Global Unique Device Identification Database or from manufacturers themselves, by just having access to the UDI or the UDI-DI, which is the device identifier. And with that, we can get to a lot of device characteristics. So we're really pushing the edge on what is too much and what can we leverage from existing databases. Again, from a coordinated registry network we wanna be able to do that because we don't want you to have to put all of that data in your records, in your EHRs. We wanna be able to go harness it. 
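To illustrate why the UDI-DI alone is enough to recover device characteristics, here is a simplified sketch of capturing a scanned GS1 barcode as a FHIR R4 Device resource. Real GS1 parsing handles variable-length application identifiers and FNC1 separators; this assumes a fixed, well-formed string, and the GTIN and lot number are made up.

```python
# A simplified sketch of turning a scanned GS1 UDI barcode into a FHIR R4
# Device resource. Real GS1 parsing handles variable-length AIs and FNC1
# separators; this assumes a fixed, well-formed string for illustration,
# and the GTIN/lot values are fabricated.

def parse_gs1_udi(udi):
    """Split a GS1 UDI into the device identifier (DI) and a production ID."""
    # (01) = 14-digit GTIN device identifier, (10) = lot number
    assert udi.startswith("(01)")
    di = udi[4:18]
    lot = udi.split("(10)", 1)[1] if "(10)" in udi else None
    return di, lot

def device_resource(udi_carrier):
    di, lot = parse_gs1_udi(udi_carrier)
    return {
        "resourceType": "Device",
        "udiCarrier": [{
            "deviceIdentifier": di,     # the DI is the key into FDA's GUDID
            "carrierHRF": udi_carrier,  # human-readable form, as scanned
            "issuer": "http://hl7.org/fhir/NamingSystem/gs1-di",
        }],
        "lotNumber": lot,
    }

dev = device_resource("(01)00844588003288(10)LOT123")
print(dev["udiCarrier"][0]["deviceIdentifier"])  # -> 00844588003288
```

The point of the sketch is the asymmetry Marty describes: the site's EHR only has to store the DI, and the registry can look up the remaining device characteristics downstream rather than asking clinicians to re-enter them.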
But what we do need in your EHRs is the unique device identifier, which in some cases is being enabled through barcoding and in other cases is data entry. And so we're trying to increase the adoption level across EHR vendors, which doesn't exist today. And we know that. And I think the next slide is just our why. We care about this greatly because of patient safety. We also know that it's going to help improve the delivery of care and improve our understanding about patient outcomes. There's a lot of data that's kind of put into electronic health records that we don't have access to, or that takes a lot of data mining. And the more structured we can get and the more we can leverage FHIR to extract this information, the better off we're gonna be. How do we do this? I'm just setting this up for the discussion later. We have to do this together. As Anish said, we need to engage all of the stakeholders. We can't leave anyone out, because when we leave people out we lose the requirements for the standards that are being built, or even whether the standards have been able to fit all of our needs. And we need to utilize standards. I mean, I think what we're here today to really speak about is that there are standards that will make this easier for us in the long term. And we need the contributions of everyone to make sure that in the very complex ecosystem that we live in, although we're working at it from a registry context and you might be working at it from a provider, day-to-day perspective, all of those things, if we work together, we're gonna have better standards built that can manage all of our needs and not just one of our needs. And I just wanna leave you there. And from a medical device perspective, medical devices are getting smarter and they're containing and communicating more data by the day. And we need to be able to work in a very fast and interoperable way. Otherwise we're going to go back to everybody having their proprietary communication protocol and not sharing data. 
So if we can work together, we will be able to catch up with the demand of the technology that's kind of being developed around us. So I will leave that there for the next speaker. Wonderful. Thank you so much, Marty. That was really helpful. Interestingly, I was at a meeting last week with Rob Califf, who also was hammering home the importance of us also figuring out how to do this around post-marketing surveillance for drugs as well. So I think this is a bigger question. Obviously we all need to work together to make these standards work, including the device identifier. So with that, let me turn to Chris and Brian to give us a sort of on-the-ground registry perspective from the American College of Radiology. Thanks. Great. Excellent. Hi everyone. I'm Chris Trumwell from the American College of Radiology. I'm going to try and give just a kind of a simple little case study, and I tried to focus on a simple one here. If you want to go to the next slide. About some of the stuff we're doing with registries. So just a quick background on some of the things we do, this system we have called ACR Connect, just to understand what that is. And then I think a really, really simple example we have of trying to incorporate FHIR into our registries, and where things are relatively straightforward and where there were some unexpected challenges. And then I'll pass it off to Brian, who will be able to talk a little more about some of the things we're pushing on for the future of where we want FHIR to go to make registries and some of the stuff we're doing a little bit easier across the board. Next slide. Starting with just the background. Like a lot of other folks, I assume, on this webinar, we run registries. We run ours under the moniker National Radiology Data Registry, or NRDR, because it's near and dear to us. And it involves a portfolio of things like our National Mammography Database, our Dose Index Registry, General Radiology, Clinical Decision Support, and so on. Next slide. 
And typically there's three ways that one of our members or participating sites can submit data to a registry. So the first one, the simple one, is that they just extract from one of their local systems or fill in a form and submit it to our online portal. That's always the default way. It's always the most labor-intensive way as well. We also typically have APIs that, if they have a certified vendor, they could submit to. So for instance, the NMD is a very common one for that, where if they have a RIS that has been certified to be able to submit data to our registry on their behalf, they usually have a much nicer way of doing it as well. And then the last one we have, there's a thing called Triad, one that we developed, and that's what I'll kind of dive into. So Triad was really originally developed for our clinical trials that we do, our multi-center clinical trials, where the goal of it is to help facilitate transporting DICOM, or medical imaging, data up to a central location to handle. And that had a lot of success over the years. And we started to incorporate that more and more into our registries because of the same thing we were seeing with, say, the vendor APIs, where the biggest problem we have with our membership, the folks participating in the registry, is just the overhead of participating. So the more and more we can automate that, the better off we are. And that was one of the things we could do with Triad. Next slide. Where, because Triad could talk DICOM, so to speak, and what it was doing in clinical trials was helping gather up cohorts of data, anonymizing it, validating it, and sending it up to the ACR, we were able to extend that out to essentially do the same thing for our registries that were primarily DICOM-based data. So Dose is a really good example of this, where in our Dose Index Registry, basically all of the data is already in DICOM objects. 
So we get to help folks essentially set it up once, configuring their VNA or their PACS to send the data to the local Triad install. It would anonymize it, do some basic validation, and send it on up. So it became almost like a set-it-and-forget-it. There was an initial buildup, and now people could participate with almost no overhead whatsoever, and it became one of our most popular registries doing that. Next slide. And that leads us towards this program we have called ACR Connect. And really what that is, is trying to take clinical data in, because we kind of went as far as we really could with Triad on what data we can pull from DICOM. So for us, to some degree, DICOM is kind of a solved problem, but what's not a solved problem for us is involving clinical data and marrying the two together. And the way we had built Triad, it was a bit long in the tooth, but really it was more aimed at being a DICOM relay system. And we wanted something that could really do a lot more localized processing and handle marrying the two together, DICOM data, clinical data, normalizing, and so on. So next slide, please. So we're currently in the middle of essentially rolling out ACR Connect as the next big version of Triad. And one of the major benefits it gains us is now not only talking DICOM, we can also talk FHIR, and a few other things as well, but I'll focus on FHIR today, with the local clinical systems. And that gives us a lot of power to help automate and really reduce the overhead down for our members to participate in registries. So with that background, I just want to jump into a really, really simple example of us working down this pathway. It's an almost dead simple example, but I think it hits a lot of the key points that you'd see as you get to the more complex ones we're working on too. So next slide. Actually, just go two slides forward, we can skip that one, yeah. 
So in our Dose Index Registry, we're basically collecting, across your various different radiological exams, how much dose is being given on an exam. You can do benchmarking and comparisons and so on. And one of the big ways we wanted to extend that in the past was to also collect the reason for exam, because that can also have a big impact on how you might want to slice and dice the data and look at it. The problem we've always had is that reason for exam technically can be in DICOM objects, but it's not always there; usually it's not. And when it is there, you can't really trust it as the actual source of truth. It's typically two or three times abstracted down to be put on a screen for a radiologist so they can look and glance, as opposed to something more concrete, like a SNOMED code or an ICD-9 or 10 or a custom code or so on. Usually that kind of information actually lives on the procedure or order. So it's going to live in your RIS or your EHR, which means clinical data, and clinical data for the most part means HL7 v2 or FHIR. And FHIR makes the most sense for us for how we end up running, because FHIR gives us a query-retrieve model where we can basically say, as these DICOM objects are coming to us, we can then query for this individual data that we actually want and retrieve that back to build a package. So can you go to the next slide, please? And then actually advance about six times. I had some quick animation on here, but I'll just quickly queue it up and walk through it. So hit next, I think five times. There we go. So the way it ends up working then is that the PACS or VNA would send us the DICOM object first for the exam that happened, and ACR Connect would get it. From that, we pull the MRN and the accession number. We can then go to the EHR or RIS we're connected to, do a query on the ServiceRequest object, and pull that back. And then what we end up doing is normalizing the data and anonymizing it. 
So normalizing, doing things like localized mapping and so on to make sure that we're all set to go, and anonymizing it, making sure we're always mapping the same MRNs over, so we have crosswalks to go back, and making sure that we're not sending any PHI up the chain that we don't need to be gathering. And then, next slide, two more times. I think I have two more in here. Yeah, one more. There we go. And then we package it on up and then send it up to our registry centrally. And we're all set to go. And this actually gives us the benefit then that, just like before, it's a one-time setup. And from this, we can now essentially run automatically going forward, with of course the exception of errors, and we can handle that in different ways too. So next slide. So if you want to walk through how we did this, it was actually really interesting: from a technical standpoint, it was incredibly straightforward to do this. From our developer side of things, I think it maybe took us two days' worth of work to get up and running. And most of that two days was probably actually just a lot of boilerplate stuff to make it all happen. So once we actually got it up and running, we started testing with public servers out there. There's a fair amount of public FHIR servers. It's really nice to make sure that you're hitting things technically the right way, and some variation. Our real issue actually came when we wanted to move beyond that; we wanted to start working with some of our power users to actually get some real-world data from the organizations. The last-mile kind of stuff, where things tend to be a little bit different, the way data is encoded a little differently, how they extend it, customize it and so on. Next slide. And that's what I think really led us to what I call the current state for us here. Next slide. And I was trying to find a nice way of putting this, but I just kind of put it down as organizational overhead. And this was just a fun one we were talking about. 
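The pipeline just described can be sketched in a few lines: a DICOM object arrives, the patient and accession identifiers are pulled, a FHIR search is built to fetch the matching ServiceRequest (to recover the reason for exam), and the MRN is pseudonymized before anything goes up to the registry. The endpoint, search parameters, and hashing scheme here are all illustrative assumptions; real FHIR servers differ in which search parameters they support, and a production crosswalk would be stored, not just hashed.

```python
# A sketch of the ACR Connect-style flow described above. The FHIR base URL,
# search parameters, and salt are hypothetical; chained search support
# varies by server, so treat the query shape as illustrative only.

import hashlib

FHIR_BASE = "https://fhir.example.org/r4"   # hypothetical endpoint

def service_request_query(mrn, accession):
    """Build a FHIR search URL for the order behind an incoming DICOM exam."""
    return (f"{FHIR_BASE}/ServiceRequest"
            f"?subject.identifier={mrn}&identifier={accession}")

def pseudonymize(mrn, salt="registry-site-42"):
    """One-way hash so the same patient maps to the same study ID upstream,
    without the registry ever receiving the raw MRN."""
    return hashlib.sha256((salt + mrn).encode()).hexdigest()[:16]

print(service_request_query("MRN001", "ACC-2022-0042"))
print(pseudonymize("MRN001") == pseudonymize("MRN001"))  # deterministic
```

This mirrors the "set it up once" property Chris describes: everything after the initial configuration is mechanical, which is why the developer effort was measured in days rather than months.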
So these were, when I say power users, these were the big organizations, right? They had lots of resources. They've been on the cutting edge of technology for a while. They're academic. They want to be interested in this kind of stuff. And we found a couple of key things here. One is that FHIR still tends to be, quote unquote, new for IT management. So DICOM, HL7 v2: there's processes, procedures; people have been doing it for decades. You set up a new firewall, people have been doing it for decades. You set up a new FHIR interface, and lots of questions get asked that vary in how it plays out. Who needs to sign off on it? Who doesn't need to sign off on it? How does it need to play? It tends to be a little different, not quite as stable. And a lot of the current processes that folks are using out in organizations in the wild today tend to be very much made for clinical applications, right? So, I'm trying to marry these two vendor systems that I bought together, and how does it play out? So there's certain expectations that arise with that versus, say, working with a registry like ours; it's just a bit different in how it plays. So it doesn't quite fit. And then the last one, I was also trying to come up with a nice name for this one, but I call it the agreement labyrinth. Next slide, I'll dive into that. This is one that we suspected, but I don't think we quite expected it to be as impactful as it quickly became. And there's a little bit of background on this too, but essentially what this ends up being is that back when FHIR was kind of first taking off, and it's been addressed since then, but originally it was behind a lot of paywalls for vendors, right? You had to pay the vendor to get access, and that's how it played out. That's since been resolved, but there actually, I think, are some lingering aspects from that. 
So around that same time, a lot of these different vendors set up developer programs where they say, hey, join our developer program. There's a yearly fee, there's different aspects. Those programs still exist. They give you more access than just FHIR, but they're still out there. What we actually found as well is that a lot of organizations started around the same time, and have continued, to adopt this policy of: you can't talk FHIR to any of our local systems unless you're part of the developer programs. End of story, that's the policy. So it doesn't matter if FHIR is free to work with or not. If you didn't join their program, you're not gonna communicate on FHIR with us. So it kind of actually put us back at this interesting square-one point of view of, okay, so now we gotta actually go back and pay into these different programs. And we're working with different vendors on how to play that out. But one of the really interesting things it sent back to us, and there actually are some benefits behind joining those developer programs too, don't get me wrong, but it was actually interesting, it goes back to that same aspect: a lot of this has been under the same standard nomenclature of, this is for a vendored system talking to a vendored system, and how it plays out with them. There's lots of money exchanging hands in how it plays, not so much an operational running cost kind of deal. So a lot of these programs actually don't make a lot of financial sense, at least for us, where we're trying to run our registries at cost. A lot of them do things where there's a yearly fee, there's fees on top in terms of how much you're going after, there's fees on top of what APIs you're using, and then there's fees on how many APIs you're hitting at a particular organization at a particular time. So as you scale up, if you're a vendored system for profit, it makes total sense; it's like pay-as-you-go. 
If you have a flat-fee operational cost, it actually almost makes it worse, where the more popular your program is, now you actually start losing money in how it plays. So there's this interesting mix in how it plays. And of course, organizations will always make exceptions. If they really want to, they'll make exceptions to go around it. The hard part you end up finding, that we ended up finding quite a bit, is you don't really scale your registries out then if you have to make exceptions at every single organization. You typically find a good handful, a couple dozen, maybe 20, 30, 40, where you have members who have enough prestige or pull at those organizations, who really care about this registry, that they'll help shepherd it through. And you kind of don't get to go beyond that, because it's just so much overhead and work for the members to try and push through these hurdles. So that's actually one of the bigger ones we're looking at and how we can work through this. Now we have other reasons we need to hit some stuff too, but next slide. I'm just about wrapped up for my portion. I'll hand it over to Brian. So the other part too, just in terms of, I did a really, really simple example. Okay, there's other ones we're working on as well. But a lot of what we're seeing in terms of what FHIR support is out there today from a registry standpoint: so again, and this makes total sense, right, most of the FHIR APIs and standards have been developed around clinical workflows, or at least operational workflows. And oftentimes you can reuse those for research or for a registry as well, but it tends to be a little bit of a square peg in a round hole. Sometimes it fits, sometimes it doesn't fit. A good example of this is our CDSR registry, which is clinical decision support. We actually tend to look for a lot of operational data that, if you were system-to-system in a clinical workflow or even a billing workflow, doesn't quite matter so much. 
So it's not already in FHIR, but things like: we wanna know who is your appropriate use criteria clinical decision support vendor. What was the criteria code that was used? Was it marked as appropriate? Was it not marked as appropriate? What was the unique identifier for that check that got assigned to you from that vendor as well? So there's lots of little things like that that also just aren't part of it. And that's some of the things we're thinking about going after. So I'm gonna pause there for my section, just an example of what we're doing now. I'm gonna toss it over to Brian for some of the stuff that we're trying to press on for FHIR in the future. Thanks, Chris. All right. I think my slides are right there. Yes. Okay, so I wanna take a quick step back and make sure everybody understands what FHIR is and how it is used, then understand what is needed to implement some of the solutions that we're talking about, and really show that the technology does already exist, but that implementation details and some of the political hurdles that remain are the biggest things that we need to overcome. So next slide. All right. So FHIR really is made up of a bunch of small little resources. They contain structured data items. If you think about them like Lego pieces, they're all built to have a special purpose, and they're designed and meant to actually fit together to build something else. So some of these items are a Patient resource, which gives me the demographics about the patient; a ServiceRequest, which actually tells me that there's something that I need to do, something like a blood test or a biopsy, or in the ACR's case, a CT or an X-ray. There's an ImagingStudy resource that represents all the content of a DICOM imaging study; it lists all the images that were produced as part of the initial service request for a specific patient. Then there's Observation. 
Those are the measurements that are made about a patient, something like a blood pressure, a blood count, the patient has a fracture, things like that. And then observations are put together in things called DiagnosticReports. Those diagnostic reports are the interpretations or the test results that are put together. It's one or more observations for a specific patient and potentially for that service request. So FHIR resources, again, being like Legos, can be reused and put together in endless combinations. They can make new and more useful things. So next slide. When we're talking about using FHIR resources, FHIR has come out with something called FHIRcast. This synchronizes healthcare applications in real time. So some of the things that were talked about, with Apple Health being embedded in your EHR, or the EMR and the PACS working together, that is a result of some of the work that was done with FHIRcast. And then there are FHIR IGs, or implementation guides. These are built by standards bodies or vendors or clinical societies like ourselves here. And they talk about taking those FHIR resources and defining capabilities to come up with a use case. So if you look on the right, all of those little boxes, those are individual resources that are put together in a very specific way, almost like that little guide inside the Lego pack which tells you how to build the Millennium Falcon with all the little pieces that you got. That's really what you're talking about when you're looking at that FHIR implementation guide. So next slide. So the call to action here is that, yes, these things have been built by standards bodies. They've been asked to be part of different products, but the ImagingStudy resource itself is currently not supported in many of the major EMR or EHR vendors' products. So having that particular resource defined in FHIR is great. We know how to build that resource and use it, but until it's actually part of that EMR or EHR, it can't be used. 
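The Lego analogy can be made concrete: each resource carries a reference back to the others, and an implementation guide is essentially a prescription for how those references fit together. Below is a minimal sketch of the five resources Brian names, linked the way a dose-index style use case might link them. All IDs, codes, and values are made up for illustration.

```python
# A minimal sketch of the "Lego pieces" above: FHIR R4 resources joined by
# references. All IDs, codes, and values here are fabricated for illustration.

patient = {"resourceType": "Patient", "id": "pat-1",
           "name": [{"family": "Doe", "given": ["Jane"]}]}

order = {"resourceType": "ServiceRequest", "id": "sr-1",
         "status": "completed", "intent": "order",
         "subject": {"reference": "Patient/pat-1"},
         "code": {"text": "CT chest without contrast"}}   # reason-for-exam lives here

study = {"resourceType": "ImagingStudy", "id": "im-1",
         "status": "available",
         "subject": {"reference": "Patient/pat-1"},
         "basedOn": [{"reference": "ServiceRequest/sr-1"}]}  # back to the order

obs = {"resourceType": "Observation", "id": "obs-1",
       "status": "final",
       "subject": {"reference": "Patient/pat-1"},
       "code": {"text": "CTDIvol"},                       # dose-index style measurement
       "valueQuantity": {"value": 12.3, "unit": "mGy"}}

report = {"resourceType": "DiagnosticReport", "id": "dr-1",
          "status": "final",
          "subject": {"reference": "Patient/pat-1"},
          "basedOn": [{"reference": "ServiceRequest/sr-1"}],
          "result": [{"reference": "Observation/obs-1"}]}  # bundles the observations

# Every piece points back to the same patient and, ultimately, the order.
for r in (order, study, obs, report):
    print(r["resourceType"], "->", r["subject"]["reference"])
```

An implementation guide constrains exactly these joints: which references are mandatory, which code systems are allowed, and which elements a registry can count on being populated.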
Also, you need the PACS vendors, or as the FDA now calls them, the MIMPS vendors, or your VNA vendor to populate that ImagingStudy resource to the EHR. They are the source of truth for that imaging study, and they need to be responsible for posting that initial resource into the EHR for consumption. And on the DICOM side, I know this is a FHIR talk, but still, the use of DICOMweb, it's in many of the VNA products, but it's usually used internally and not exposed to the sites themselves. They limit a lot of the functionality to DIMSE because it's very simple. So next slide. Really the call to action here from the societies is we need to work on FHIR implementation guides for these registries, for the quality measurements. We need to work together so that we can build something that makes it easier for sites to implement this work, so that they have this kind of script that will tell them all of the things that need to be done and how that data is gonna be extracted from their systems to fill out these registries and to report CMS quality measures. Once it's already been scripted out, they can then go to their EHR vendor and say, look, I want to implement this implementation guide through FHIR, because I have three FTEs hand-entering data for this, and I don't want to do it anymore. So next, just hit next on the slide. Really, this is what we've started to do with ACR Connect. So we see ACR Connect as reading this FHIR implementation guide, talking back to the PACS and VNA systems, putting this data together, and pushing it off to our multiple registries, and being able to use it for any types of registries in the future. But this can be done by ACR, it can be done by anybody, as long as we build that script as a society with these registry providers, so that they know what they need to be doing and can push the vendors to get this done. And my last slide is really one more.
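The "is ImagingStudy actually supported in this EHR?" question has a concrete answer: a FHIR server advertises its supported resources in a CapabilityStatement served at `GET [base]/metadata`. The sketch below parses a canned example rather than calling a live server, so the structure is real FHIR R4 but the payload is invented.

```python
import json

# Parse a (canned, invented) CapabilityStatement and list which resource
# types the server claims to serve -- the check a registry integration
# would run before attempting to pull ImagingStudy data.

capability_statement = json.loads("""
{
  "resourceType": "CapabilityStatement",
  "fhirVersion": "4.0.1",
  "rest": [{
    "mode": "server",
    "resource": [
      {"type": "Patient"},
      {"type": "Observation"},
      {"type": "DiagnosticReport"}
    ]
  }]
}
""")

def supported_resources(cap):
    """Collect every resource type the server's rest sections declare."""
    return {
        res["type"]
        for rest in cap.get("rest", [])
        for res in rest.get("resource", [])
    }

types = supported_resources(capability_statement)
assert "Patient" in types
assert "ImagingStudy" not in types  # the gap the ACR speakers call out
```

Running this kind of probe against candidate sites is a cheap way to find out, before any interface work starts, whether the vendor has shipped the resource an implementation guide depends on.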
Okay, I really just want to say that this is not something that hasn't been undertaken anywhere else. mCODE has been feverishly working on this for cancer reporting. So they have put together a very collaborative international group that has been working on being able to code out and gather resources. There is a FHIR implementation guide for mCODE coding that has been put out for public comment a couple of times now, and it is definitely something that can be done. Again, I would like to say that the technology is there, the resources have been built by the standards bodies, but again, the implementation details and the political hurdles still remain today. So I'd like to thank everybody for their time, and I will turn it back over. Great, thank you so much, Brian and Chris, that was really helpful. Anish, if you could join us back on if you're still with us, I know you've been feverishly typing comments, answering everyone's questions left and right. I just want to say this was wonderful to hear sort of what the potential is. And we heard a lot of that in what Marty could describe that MD EpiNet has been able to pull together, still recognizing there are some barriers. I loved what Chris and Brian were able to share with us. And I think it very much goes back to what you said at the start, Anish, that so much of this is not technical. It's, I think, as Brian said, it's implementation and politics. So I'd love your perspective, maybe just to start, Anish: what is that logical path forward? Really building on the call to action that Brian put up there for us, what could we collectively do, especially societies, to really move beyond the implementation barriers and the politics, to say nothing of the vendor space and the money involved, which is just real?
Yeah, look, I think in the crawl, walk, run of life, the crawl is the obvious, which is to get Brian and Chris, I'd love to help them, you know, find someone that's live on the Cures Act regulated version of these EHRs, not this kind of hybrid that was sort of free to the consumer and then paywalled and weird for the, you know, registries. Let's go to the Cures Act versions where there's no more excuses. Like it's just, you know, does that system connect to ACR Connect? So the obvious thing to do is to get two, three, or four places to download the free ACR Connect software. And I assume, Chris and Brian, every radiologist basically has, I don't know what the universe is, but every site that's live is, you know, you have a system that you could interact with, you give a reason for interacting. So crawl is, let's take the same tech in the right governance model and test it. And Helen, you should write about it. Like, this is fake news and this is this. It might even threaten, like, did you realize this is, you can't add this burden. And I'm very happy to join in writing a post about it or whatever. Medium, and this maybe is a little bit more on the Marty side, I do think that, and this is the reason why HHS announced a new HHS-wide policy, because you've got a left-hand, right-hand problem. So I feel like FDA, NIH, CDC, CMS are somewhat familiar with quote-unquote FHIR, but don't really know this regulated information blocking version and what it specifically can do. Like, Marty, I was reading up about all this women's health CRN. Oh my God, it's hundreds of pages of technical complexity, way crazy stuff. I mean, I'm sure really important over the last five years, you didn't really have a choice, but if you were to sort of redo the whole thing today, it'd be a skinny little SMART on FHIR app, and we could test it.
So I would dust off that women's health FHIR app and demonstrate, because Helen, I think the other agencies could, in this medium bucket, be an influencer as the third voice in the room. So it's not ACR plus health system arguing with an EHR vendor and an IT department; there's an agency of interest in the room, not ONC, but the, you know, so that's medium. And then obviously I think, you know, Brian and Chris laid out a beautiful roadmap for new implementation guides that are really custom built for the use cases of the registries. And that's just getting multi-stakeholder consensus as to how this should work. That's why I suggested the TEFCA FHIR pilot, because we would just do all that in the pilot, and then everybody would follow the same approach, and we would test it, validate it, and scale it so anyone else could benefit. So this is a lot more near-term attainable. Nothing I've heard today implies that those things need to wait three to five years. Those are all actions we can take by the end of this calendar year. Yeah, I think especially that there is political will, and there is something clearly happening in terms of, you know, the FHIR-first initiative within the federal government; I think that is a huge opportunity here. And you know, Anish, you've been talking about the work we're doing as well, you know, for CDC, of getting data across 50 or 60 health systems on immunization. So I think there is a great opportunity here for us to think about. I love the idea of us thinking through a registry-specific implementation guide that maybe CMSS could put together with your help, bringing multiple societies who are sort of at that point and ready to work on this as well. And let's see, I think about four minutes left. Kathleen, you've asked a couple of questions. Do you wanna just ask your question live? Otherwise, I'll just read it. Is Kathleen able to speak? Otherwise, I'll just go ahead.
Kathleen Hewitt leads the research initiatives over at the American Society of Hematology. And she said they've been working with IQVIA, and, for example, they've been piloting EHR API data submission for their registry. Their pilot sites are two Epic sites and two Cerner sites, with delays, of course, at some of them. The first site is making good progress, and they're hopeful that the pilot will actually demonstrate that significantly fewer resources are needed to integrate with the EHR data. So we are starting to see some examples of how some of this moves forward. But Helen, here's my worry. Kathleen, please listen to this. If you end up in the unregulated, local, kind of jerry-rigging world, it doesn't scale. So you'll have this nice story and progress, but it doesn't go to the next site and scale. You wanna be really specific. Cerner won't even ship this until Q1 of next year. So you already know that those are not doing it. So you've got to find the Cures Act live sites. This is the Epic August 2021 edition. Find the health systems that are live on that edition and then start over with the connectivity, because it's a different animal than going into a site that doesn't have those features, where the interface team, whether the interface team talks HL7 V2 or FHIR, it's still an interface team. There are no efficiencies gained from piloting somewhere that doesn't have the right tools to pilot. Yeah, that's a really interesting point. Maybe we can get that list for people and share it. That's not public. You gotta ask health systems. You just gotta ask them, are you on the August 2021 edition of Epic? And I think, I don't know if it's been public, but Cerner is emailing some kind of deadline timeline, which I thought was Q1, could be the end of this calendar year, but it's not now. Okay, well, it's certainly a good place to start, at least by asking about Epic for that version.
And it just sounds like, I mean, we only have a couple of minutes left, but I just wanna thank everyone, certainly, for giving us lots of food for thought. I think there's a real opportunity for us. I'd love to be a fly on the wall of the Anish-ACR conversations to follow, because I think it might be a really nice example of what we could build beyond ACR as well. Because I think, while a lot of their issues are with the imaging, it can't be that different than what most of the rest of us are trying to do with others versus- No, no, no. Yeah, they're querying for the reason. I don't know exactly if the reason for exam is in, Chris and Brian, is that in USCDI, or would that be like a separate data element that's sort of unregulated? Oh, that's already in it. That's the only reason why we use that as a simple example, that it's very clean cut. It's there, it's across the board. Yeah, Helen, don't just be a fly on the wall. Sit in the room. Let's do it together. Let's document. Yeah, I think it's great. And let's tell everybody how it works. I think it's great. There's also a comment from Leon Rosenblatt, who works with lots of our societies through IQVIA, making the point that even beyond the collection of EMR data for secondary use, registries also provide a process for data curation that really helps to transform some of those raw observations into scientific variables. And we're gonna have to really think about what's the vision for getting that more broad-based interoperability that supports the disease-specific semantics, as we've been calling it, sort of the lexicon for longitudinal measurement across our societies. Your direct question to Chris, right? Do you collect indication for study is exactly the kind of thing I think we need to get specialty by specialty of what's most important for research, outcomes, delivery, et cetera. Well, lots- Helen, one of the things that we're doing in MD EpiNet is exactly just that: defining that minimum core data set.
Well, it sounds like we need to follow up on that too, because we don't want to reinvent that wheel; as we often hear from our friend Andy, there's no time for reinventing. But Helen, remember, as we end this, the current timeline to add new data elements to USCDI is in years, not weeks and months, with one exception. Other federal agencies that have priorities can submit those elements as USCDI+ projects. So what you'd want to do is lean heavily on Marty to be officially recognized, as those extra data elements have to go through an FDA-sponsored USCDI+. So the EHR will know that while this isn't broad-based required, it's required for FDA. And to my knowledge, there's not been a single-agency USCDI+ project since the program was announced in the spring of this past year. So it's empty. That list is empty. Let's get on that list yesterday. Yeah, there's some potential around the adverse event surveillance that's on USCDI+'s kind of radar. But I do think that what we do in that space needs to be expanded greatly. All right. Well, I don't want to hold everybody up. Thank you so, so much. This has been phenomenal. We will share the recording. We will share the slides. I definitely need to watch it again. I learned a lot. And we'll all be in touch, because I think the purpose of this is to sort of share our knowledge, but really, more importantly, what can we do together? Because time is of the essence. These rules are going to come at us really soon, and we can't miss what's potentially a really important window to act. So with that, thank you all so much. Thank you, Heidi, for pulling this together. Thanks to the CMSS staff for making it work, and we'll get everything out to you for your perusal. All right. Thanks everybody. Take care. Bye.
Video Summary
The second webinar in the CMSS Registry Science and Research Webinar Series focused on the implementation and benefits of FHIR (Fast Healthcare Interoperability Resources) for advancing clinical registries and research. The session was coordinated with the help of CMSS staff and Heidi Basley, building on the first webinar from July 25th, which covered USCDI.

The webinar featured presentations from notable figures like Anish Chopra, President of CareJourney and former CTO for the Department of Health and Human Services, who emphasized the policy and practical aspects of FHIR technology. Chopra discussed Cures Act regulations going live this year and the current readiness of certified EHRs, emphasizing that only about 5% of these systems had updated their capabilities, though this covers much of the market.

Marty Velaziz from the FDA's Coordinated Research Network talked about leveraging real-world data and improving device identification and data collection efficiency using FHIR applications in various subspecialties within MD EpiNet. This aimed to reduce the repetitive nature of data entry and enhance real-time transmission and storage of medical data.

Representatives from the American College of Radiology, Chris Treml and Brian Bielecki, shared insights from their experience and pilot projects integrating FHIR with registries using their ACR Connect program. They highlighted both the technical feasibility and the real-world challenges, such as organizational overhead and vendor restrictions.

The overarching message was the need for specialty societies to collaboratively develop implementation guides specific to their registry needs and to leverage government incentives and regulations to make FHIR a practical reality. The session emphasized the importance of action, collaboration, and addressing both technological and political barriers to realize the full potential of FHIR for clinical data interoperability.
Keywords
CMSS
FHIR
clinical registries
research
Anish Chopra
Cures Act
EHRs
real-world data
MD EpiNet
American College of Radiology
ACR Connect
data interoperability