Escape Data Paralysis and Uncover Lifesaving Data
New advancements in virtual medicine have made care easier for patients and providers alike. Now a new wave of innovation is being fueled by actionable insights pulled from the vast amount of patient data that already exists. Videra Health is an innovator in this space, improving patient treatment by using AI and remote monitoring to prioritize safety and care for those who need it most. Join Michael Thurston from Xantie, a Veracity Partner, and Videra Health COO and Co-Founder Dr. Brett Talbot for a webinar to learn how Videra is helping organizations escape data paralysis and uncover life-saving data. A transcript of the webinar is available below.
Full Webinar Transcript
We use Rev.com for all of our transcription needs.
Katie Frank (03:13):
Good morning, good afternoon, and good evening, everyone. Thank you so much for joining us today. My name is Katie Frank and I am the Content Marketing Manager for Veracity Solutions. Veracity Solutions helps enterprises, especially those in highly regulated industries, with custom application development and adopting cloud-native technologies as a platform for innovation. We plan, build, and launch successful software solutions from conception through adoption. Our expert advice, coaching, and skilled developers can help your organization identify and achieve critical business goals and lift your team in the process.
Katie Frank (05:53):
And we have two very exciting speakers today. The first one I'll introduce is Dr. Brett Talbot. Brett is a licensed psychologist and the Chief Operating Officer and Co-founder at Videra Health, a digital health technology company focused on making healthcare scalable. Previously, Brett was chief clinical officer, executive clinical director, and senior director of research and quality for residential group practice and healthcare organizations. Brett is also a faculty member at Utah Valley University, and he's fun and easy to talk to. He's an avid outdoorsman. He enjoys hiking, camping, backpacking, fly fishing, and anything that gets him out into nature.
Katie Frank (06:38):
And our second speaker today is Michael Thurston. Michael is a firm believer that data will make good organizations great. For the last decade, Michael has been trying to transform the healthcare industry with data best practices to improve patient outcomes, reduce costs, and increase revenues and profits. He advocates the use of data to improve processes and drive decision-making. His skills allow him to speak to all leaders of an organization, from the C-suite to the frontline employees. In 2016, Michael started Xantie to help organizations with their data and analytics needs. Xantie has worked with organizations from almost every industry, including healthcare, manufacturing, supply chain management, pharmaceuticals, construction, and digital marketing. And with that, I will turn it over to Michael.
Michael Thurston (07:32):
Awesome. Thank you, Katie. And thank you everyone for joining us this afternoon. Super excited to present on this subject, super excited to be here with Dr. Talbot and walk through a few use cases, showcase some tools and highlight different opportunities that we can use to escape data paralysis and hopefully ultimately save lives. So from an overview perspective, we talk a lot about big data and I haven't ever come across an organization or a company that doesn't feel like their data is big. Everyone thinks they have big data because it's a loose term and I'm okay with that. I'm okay that everyone believes that they have big data because they do. They have big data challenges and using that data to help them solve problems can be difficult.
Michael Thurston (08:34):
I was speaking with one organization last week where the director highlighted the issue almost perfectly in that data isn't his whole job. His job is to make business decisions and to help people. And so going and analyzing data and analyzing different reports all day, every day isn't something that he can actually do. And so in order to leverage data to help him solve problems, there's a few techniques that we're going to talk about today that ultimately don't really cost a lot of money but can provide a ton of value to an organization.
Michael Thurston (09:21):
So let's talk about the agenda. We are going to define what data paralysis is. We're going to talk about a tool that we call closed-loop analytics, and then we're going to look at some of these tools. Videra Health actually highlights these opportunities and these tools really, really well, so we're going to walk through a few use cases there. We can also talk about different outcomes, and how leveraging alerting within the data has saved lives and helped organizations improve their outcomes. And then we'll do a quick summary, recap what we've learned, and finish with a quick Q&A.
Michael Thurston (10:09):
Many of you are probably familiar with this diagram. It's very generally used, right? You have your source systems over here on the left, you have ETL process which then pulls the data into a data warehouse. And then you have all of these different tools for reporting, for analytics, and for data mining to ultimately get the data into the hands of the business leader to make good informed decisions.
Michael Thurston (10:39):
Now, what I want to highlight here is that you can see there are a lot of different systems, and sometimes they're not even systems, sometimes they're just flat files, right? But there's so much data put in front of business leaders that it becomes very difficult for them to analyze it. Even if we're using dashboards or different reports, sitting there analyzing and agonizing over, okay, how do I filter this? How do I get to what's meaningful? It can cause what we call data paralysis. There's so much data that you freeze up and you can't make a decision, because you feel like, well, I've just got to look at one more thing. I need one more piece of data before I can make a decision.
Michael Thurston (11:24):
The other thing that I want to highlight is closed-loop analytics. So how do we escape from that data paralysis and provide simple data solutions to decision-makers at the point of decision making? Really the only difference between this slide and the last slide is that we're taking that data, packaging it really well, and bringing it back into the source system so that the data can be at the point of decision making. That's the critical piece: if you're able to provide the right data at the right time, that's where you're really providing value to the organization and to the decision-maker, because then they have the data when they need it to make that decision. That sounds simple, and it can be complex, but Brett's going to walk through some examples today of getting the data to the right people at the right time to make the right decisions. And so I'm super excited to turn the time over to Brett so that he can highlight these tools and ultimately show us what we can do to escape this data paralysis.
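To make the closed-loop idea concrete, here is a minimal sketch of computing a result downstream and writing it back into the source system where the decision gets made. This is an illustration only, not Videra's actual architecture: an in-memory SQLite table stands in for the operational system, and the table name, `risk_flag` column, and scores are all invented.

```python
# Closed-loop analytics sketch: compute a metric downstream, then write the
# result back into the source system so it is visible at the point of decision.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the operational source system
conn.execute(
    "CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, risk_flag INTEGER DEFAULT 0)"
)
conn.executemany(
    "INSERT INTO patients (id, name) VALUES (?, ?)",
    [(1, "Alice"), (2, "Bob")],
)

# Pretend these scores came out of the analytics pipeline (warehouse / model).
analytics_scores = {1: 0.92, 2: 0.15}

def close_the_loop(conn, scores, threshold=0.8):
    """Push analytics results back into the operational table as a simple flag."""
    for patient_id, score in scores.items():
        flag = 1 if score >= threshold else 0
        conn.execute("UPDATE patients SET risk_flag = ? WHERE id = ?", (flag, patient_id))
    conn.commit()

close_the_loop(conn, analytics_scores)
flagged = [row[0] for row in conn.execute("SELECT name FROM patients WHERE risk_flag = 1")]
```

The point of the design is that the consumer of the flag never leaves the system they already work in; the analytics result is waiting in the record they open anyway.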
Dr. Brett Talbot (12:41):
Yeah. Thanks, Mike. And yeah, equally excited to be here and talk about this subject. It's something that, as a practitioner, as a clinical psychologist by training, I experienced in my own practice with my own patients, where I either had a lot of data about a patient and didn't have enough time to analyze it to make treatment decisions, or I didn't have enough data and I needed some of that data at that point of decision making. Part of my practice was realizing that that closed loop and that data paralysis was happening more often than I realized, across a lot of practitioners at all levels of healthcare delivery. So at the time I started getting into technology and really realized that technology and these types of analytics and tools can help us do that. And that's why I created and co-founded Videra Health. The main piece around that is that we use artificial intelligence, machine learning, and automation to address that data paralysis.
Dr. Brett Talbot (14:01):
To really understand alerting, and where to get those analytics at the point of decision making, you have to understand the larger problem, which is that our current approaches aren't scalable. We have a lot of people to see, a lot of data about them, and not enough people to see the patients and analyze the data in order to decide how we do that. So what ends up happening, because of all these limitations, is we end up tracking vital signs. We determine, here are the three to five things that I have to know in order to treat this patient or make that decision. And we actually lose about 80% to 90% of what's actually going on in the healthcare setting with a patient because of all these limitations, because in order to get that data we have to be face to face, or I don't have enough time to analyze it.
Dr. Brett Talbot (14:58):
So of all of that data we could have, or do have, we really only look at three to five things and we base all of our decisions on those. But in my conversations, in my practice, and as I go about talking about this, we realized that if, instead of just tracking those five things, we were tracking 100 things, that big data, but a system allowed us to understand which five I should be looking at, we really started getting out of that data paralysis and into decision making, and then saving lives. There's that difference between track five and look at five, versus track 100 and look at the five most important. But because of that paralysis, we end up just looking at a few specific things.
Dr. Brett Talbot (16:00):
And part of that is because there's this big black box that exists in healthcare. All the data that we're getting, even though it is a lot of data, comes primarily from face-to-face interactions, and that requires somebody to be there gathering some of that data. We can pull patient-generated and provider-generated data from EHRs and other things, but we don't actually have a lot of information about what's going on when they're not in front of us, when the patient isn't actively engaged in an interaction with us. And that actually allows us to, in a sense, get more data. Now, I know we're talking about data paralysis and how to analyze and model through all of this data, and so it's somewhat counterintuitive, but our approach is that more data actually helps us get unparalyzed from that data by using alerting, which we'll talk about.
Dr. Brett Talbot (17:05):
So we actually need more information to know which information is most important. So we'd spend a lot of time gathering information around this black box when they're not in front of us in order to determine what is most important. And Mike, feel free to jump in at any time here. I know we've discussed these topics at length as well.
Dr. Brett Talbot (17:33):
If the problem is we're paralyzed with data and we need to model through it, our approach is to get more data beyond those vital signs so you know what's important. We need a way to do that, a consistent and scalable way. We've talked a lot about these vital signs. If you think about, let's say, diabetes management, or a cardiac care patient, we have these peripheral devices, a Bluetooth-enabled glucometer or a Wi-Fi-enabled blood pressure cuff, giving a constant feed of data about how the patient is doing. But when it comes to other non-vital sign conditions that are chronic in nature, where the symptoms are more observable, we don't have a blood pressure cuff for depression or a glucometer for anxiety or emotion-based or mood-based conditions. And so what we found in this video... Oh, go ahead, Mike.
Michael Thurston (18:36):
Brett, there's no blood pressure cuff for depression, but what were some traditional ways of gathering information, right? Because you said before sometimes you just didn't have enough information, right?
Michael Thurston (18:55):
What are ways that it was gathered before?
Dr. Brett Talbot (18:58):
Yeah. Good question. Typically what we see is we give out questionnaires, screeners, multiple-choice, paper-and-pencil instruments. And even when we digitized those paper-and-pencil surveys or questionnaires or screening tools, it still took time to score and analyze and graph, and we still only got a vital sign. We got a specific condition that was being measured by that instrument or by that tool.
Dr. Brett Talbot (19:29):
If we wanted information about a myriad of things, we would have to give a survey or a questionnaire that ranges from 10 to 60 questions about each condition, and it became not scalable. It became too much data. That's when we got paralyzed: patients got survey fatigue, providers got results fatigue and didn't know how to go through it. And we really realized that we're missing 70% of the boat by just giving surveys. We're missing everything else that makes that patient a person: their mood, their affect, their sentiment, how they exist when they're not taking that survey, and the emotions that they exhibit throughout a week or throughout a day. During that black box is where we find more value than those traditional methods.
Dr. Brett Talbot (20:27):
And I think people have realized that; we just had no way to scale it, right? We had no way to get beyond the traditional assessments. So we need a tool that's consistent and scalable, something that can get at that black box, that can get beyond those vital signs in a way that doesn't create survey fatigue for the patient. So it's easy to take, it doesn't disrupt their life, they don't have to get a babysitter, drive down to the clinic, sit in a lobby, meet with a practitioner, get some vital signs, then finally get to see the main provider. We needed an easy way for them to interact that tracks and documents these types of data and analyzes them in real time, running in the background. We needed a tool to do that, so we set out to build it.
Dr. Brett Talbot (21:18):
So we've created a clinically sophisticated automation, a tool that tracks these things in the background, connecting with the patient in real life outside of the office visit, and analyzes them and alerts. We'll talk about the alerts in a minute, but I wanted to touch on that idea really quickly: a lot of the paralysis is often, look, I just don't have time to gather and analyze and do this, and even if I hand that to a staff member, they're having to do that instead of more direct care. We really want people to be able to practice at the top of their license, do what they do best, which is treating the acute conditions that only a human can address, and use AI and alerting to automate tracking all of the other clinically related and session-related information, which allows more time for you to do treatment.
Michael Thurston (22:30):
Brett, you said before that you could only capture somebody's mood when they were taking the surveys. Does this allow for more frequent follow-ups and more touchpoints?
Dr. Brett Talbot (22:46):
Yeah. Absolutely. The idea is to meet them where they're at, right? Instead of finding a time that we have to be synchronous, that we have to be face-to-face. And even if we send out a survey, again, that's just vital signs. So we have a way, and we'll talk about this in a second, to send a text; they click on the text and they're able to submit a video response to any prompt. Our average check-in right now is only about two minutes, but we do it more frequently. So we have outpatient psychiatrists who usually only see patients every three months, but they enroll them in a Videra Health workflow and we're checking them in twice a week or once a week, or if there's a new med change or a dosage change, we check in more often. And all of a sudden, we have all these clinical data points about what's going on with them outside of our office. Again, that creates a big data problem, but then we use AI to extrapolate what's most important out of that in real time.
Dr. Brett Talbot (23:57):
That's where we get to the star of the show, so to speak: alerting. Part of alerting is we can take surveys and say, if they score above this, alert us. And that's all great, but again, we're really missing everything else that makes that patient a full person, this whole holistic approach. So in addition to those traditional methods, there's an opportunity for patients to check in anytime, anywhere, without scheduling a visit and without having to be face to face with a provider, and give us raw data from a peripheral device that's not vital-sign centric, right? It's not just taking heart rate. You get on a video and we can track a lot of things off of the video that become available on a dashboard, and so you can start to create baselines for people.
Dr. Brett Talbot (24:59):
So if I know what Susan looks like and sounds like when she's stressed or in a manic phase, and she's been taking some Videra sessions, we also know what she looks like and sounds like right before her manic phase. She's checking in, in the background, creating this database that's personalized for her, and the AI and machine learning is learning what's normal for her, what's baseline, so we can start to alert when that starts to deviate, when she starts to decompensate.
Dr. Brett Talbot (25:25):
And we do that by analyzing the video. These are, again, the alerts coming out of the video. So for example, we can do computer rating scores, which is a more non-threatening way to say artificial intelligence, where we say, hey, look, we know what to look for and listen for when we're face to face, we just can't be there with everyone all the time. So we can train a computer, which is really good at tracking 100 things at a time where humans can track three to five things at a time, and alert us to changes in language, cognition, emotions, rate of speech, things of this sort, in addition to standardized scores as well as human ratings and informant reports.
Michael Thurston (26:14):
Brett, how did you know, all right... There are a few questions here and they're around the process, right? How do you know, all right, I need to set up an alert for this, or how do you say, okay, this is outside of the normal? The other question I have is, how did you know, okay, here's what I normally look for, and this is just vitals; here are the other data elements that are now significant as we're monitoring patients? There are a few questions there. Sorry.
Dr. Brett Talbot (26:50):
Yeah. Good question. I think I heard two main ones. I'll try to address those if I'm off you can correct me. The first part is how do we know what to look for and listen for? Well, in healthcare, it's fairly prescribed, right? In business, in human resources, every interviewer has a different set of criteria, what they want to ask an interviewee that makes them a good candidate for a job, very, very subjective. In healthcare, we get a little bit more prescriptive. We have a little bit more visibility into clinical criteria. So we know what we're looking for, but even with that said, there's differences in how providers go get that information, interpret that information, and analyze that information. So this actually is a way to make that more consistent and take the subjectivity out of something like emotion and how we track emotion and make it more objective. So we can take what we know we should be looking for and ask a computer or ask analytics to look at that.
Dr. Brett Talbot (28:06):
The second part is, how do we know when we should alert, or where that threshold is? We know on a standardized measure that if they score above a 20 on the PHQ-9 then they're exhibiting depressive symptoms, so we know that. But what we've done is we've taken some of that data and done cross-validation with video, so we can now detect what somebody's PHQ-9 score would be just based on 30 seconds of video. So instead of having to give five different surveys for five different conditions, we can take one 30-second video and run it through five different algorithms, each looking for one condition. All of a sudden it becomes more scalable and reasonable for patients.
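As a rough illustration of the "one video, several algorithms" idea, the sketch below scores a single feature vector, extracted once from a short video, with multiple per-condition models. Everything here is invented for illustration: the feature names, the linear weights, and the two estimators; real models would be trained and clinically validated, not hand-written.

```python
# Illustrative sketch: one feature vector from a short check-in video, scored
# by several per-condition models so the patient answers once, not five times.
features = {
    "speech_rate_wpm": 95,        # hypothetical extracted features
    "negative_word_ratio": 0.18,
    "eye_contact_ratio": 0.4,
}

def linear_score(features, weights, bias=0.0):
    """Simple stand-in for a trained estimator: a weighted sum of features."""
    return bias + sum(weights.get(name, 0.0) * value for name, value in features.items())

# Invented weights, one model per condition being screened for.
condition_models = {
    "depression_phq9_est": {"negative_word_ratio": 60.0, "eye_contact_ratio": -10.0},
    "anxiety_gad7_est": {"speech_rate_wpm": 0.05, "negative_word_ratio": 30.0},
}

estimates = {name: linear_score(features, weights) for name, weights in condition_models.items()}
```

The design point is the fan-out: the cost to the patient is one short video, while each additional condition is just another model run over the same features.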
Dr. Brett Talbot (28:58):
I'll also highlight that sometimes that threshold isn't so much how you compare against the rest of the world, it's how you compare against you. This is personalized healthcare at scale. This is, is it different for you? Mike Thurston always talks at about 120 words per minute. That's his baseline. Now, if he goes up to 140, we may need to know about that. It could mean that he won the lottery and he's super excited, or maybe he's in crisis. So sometimes the alerts are more about what's different for an individual than how they compare against any clinical criteria. Both are valuable, and so we use both.
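The personal-baseline idea Brett describes can be sketched as a simple deviation check against a patient's own history. The readings and the two-standard-deviation cutoff below are illustrative assumptions, not the system's actual statistics:

```python
# Personal-baseline alerting sketch: flag a new reading that deviates more
# than k standard deviations from this person's own historical mean.
from statistics import mean, stdev

def deviates_from_baseline(history, new_value, k=2.0):
    """Return True when a new reading is more than k std devs from the personal mean."""
    mu, sigma = mean(history), stdev(history)
    return abs(new_value - mu) > k * sigma

history_wpm = [118, 121, 119, 122, 120, 120]  # someone's usual ~120 words/minute

deviates_from_baseline(history_wpm, 121)  # within their normal range
deviates_from_baseline(history_wpm, 140)  # well above their baseline: alert
```

Note the check says nothing about whether 140 words per minute is good or bad in the population, only that it is unusual for this person, which is exactly the "compare against you" framing.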
Michael Thurston (29:47):
Okay. That's awesome.
Dr. Brett Talbot (29:48):
Yeah. One of the biggest things that we find helps providers get out of data paralysis is language analytics. Oftentimes, especially in behavioral health and mental health, somebody's talking a lot and we're trying to extrapolate, out of everything that's being said, what are the themes? What are the things I should be thinking about? That takes a lot of time, and we usually do those interactions face to face. So by having 30- to 120-second check-ins and having analytics really pull things out, we transcribe everything that's said in a video submission and we scrub it looking for certain key phrases. So if somebody's saying something suicidal like, "I'm super depressed, I feel like I can't do it anymore," or things like, "I've lost my job," certain themes get picked up, and we can surface those to the top and say, these are the people who are in crisis, who are struggling. And you didn't necessarily have to go play phone tag with all of them to find that out.
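A toy version of that key-phrase scrub might look like the following. The phrase list is invented for illustration, and a production system would rely on validated lexicons and trained language models rather than literal substring matching:

```python
# Transcript-scrubbing sketch: transcribe the check-in, then scan the text
# for phrases associated with crisis and surface any matches.
CRISIS_PHRASES = [
    "can't do it anymore",
    "lost my job",
    "no point",
    "super depressed",
]

def flag_crisis_phrases(transcript):
    """Return the crisis phrases found in a transcribed check-in."""
    text = transcript.lower()
    return [phrase for phrase in CRISIS_PHRASES if phrase in text]

hits = flag_crisis_phrases("I'm super depressed, I feel like I can't do it anymore.")
```

Even this crude matching shows the triage value: a provider sees which check-ins contained flagged language without listening to every submission.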
Dr. Brett Talbot (31:00):
These are some of the current metrics where AI can really be powerful in getting us out of that data paralysis and closing that loop: getting the data in, analyzed, and back into the hands of either the patient or the provider to say, this is where I'm at, and this is what I need to address. We have things we call signals. Think about it this way: as a symptom is to a condition, signals are to an algorithm. We talk about absolutist language, always, never, which is indicative of somebody usually in some sort of crisis. Word complexity, which is a derivative of cognition: if I usually talk at this grade level and all of a sudden I dip, maybe that's a side effect of a medication. Emotional distress, rate of speech. None of these necessarily means that somebody is or is not progressing or decompensating, but it starts to surface these signals out of a large data set.
Dr. Brett Talbot (32:05):
We have algorithms, I mentioned the PHQ-9 and GAD-7, where we can predict those scores accurately based off of just video. So now you have quantitative and qualitative as well. When those things get alerted on the system, they can go in real time to providers or be available on a platform. So if you think about a caseload of 100 people: I'm at a hospital, let's say, and I've discharged 100 people. If I wanted to know how they were all doing in the seven days post-discharge, I'd have to call them all and find out how they're doing. And most of them aren't going to answer, because we have to be live. But by doing this in the background, using artificial intelligence and automation to get out of that paralysis and surface real-time alerts, all of a sudden I can log in on a Monday morning, or seven days later, and I can see all 100 patients have checked in, or say 80 to 90 of them, but I know the 10 that I need to reach out to immediately.
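The Monday-morning triage Brett describes, seeing at a glance who alerted and who never checked in, can be sketched like this. The patient names and record shapes are hypothetical:

```python
# Caseload triage sketch: split enrolled patients into those who alerted,
# those who checked in fine, and those who never responded.
def triage(enrolled, checkins):
    """Summarize a post-discharge caseload from its check-in records."""
    responded = {record["patient"] for record in checkins}
    alerted = sorted(record["patient"] for record in checkins if record["alert"])
    missing = sorted(set(enrolled) - responded)
    fine = sorted(responded - set(alerted))
    return {"alerted": alerted, "fine": fine, "no_response": missing}

enrolled = ["ana", "ben", "cruz", "dee"]
checkins = [
    {"patient": "ana", "alert": False},
    {"patient": "ben", "alert": True},
    {"patient": "cruz", "alert": False},
]
summary = triage(enrolled, checkins)
```

Instead of a hundred phone calls, the provider's Monday starts with two short lists: who alerted, and who went silent.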
Dr. Brett Talbot (33:10):
So things that we're building over time are stress in the voice, abnormal movements, and twitches, slurring of words, other key phrases that are condition-specific more around cognition. We're working on a tardive dyskinesia model where we can detect side effects of medications, things of that sort just off of video.
Michael Thurston (33:37):
Brett, some of these data points that you're calling out here, are those consistent with what you would normally be looking for in a situation where you're actually sitting down with a patient like I'm thinking of rate of speech, right? You're like, well, that's not 110 words per minute, that's 111, I need to have a conversation here. Is this above and beyond what you would normally get just from a regular visit?
Dr. Brett Talbot (34:06):
Yeah. It's above and beyond in two ways. I'm glad you asked that. The first one is that as humans, we cannot pay attention to 100 signals at once, and that's why we talk about vital signs. So if I'm meeting with somebody, I'm thinking about three to five things that I'm looking for and listening for, so I'm missing 95 other things that are equally relevant. I just don't have the capacity in the moment to do it, and so a tool like this can help with that. The other way that it's above and beyond is that we can start tracking things that we as humans are incapable of tracking. Rate of speech, which you brought up, is a good example: I don't know a human that can listen to somebody talk and then say, this is how fast you're talking, and then be able to compare that to the last six months of interventions and see how that's different.
Dr. Brett Talbot (35:01):
We don't compute like that, but computers and AI are really good at remembering the things that we struggle as humans to remember. So this can really empower us to have better data, more relevant data, data that we would've otherwise not had, like this slide that I just brought up. And that provides for better treatment planning and better outcomes, because we have all of this data but still in bite-sized ways, because alerts have surfaced it. So thank you, Mike, for bringing that up. I think we've talked about how these quick and more frequent touchpoints actually bring more value than episodic snapshot data analytics, where you only have a one-moment snapshot of a patient every three months. This allows us to make data more actionable, because it's in real time, when it's happening, and more often.
Dr. Brett Talbot (36:15):
In the last few minutes before we get to some summary and some Q&A, I just wanted to highlight a few use cases. This is a hospital that we were recently working with, doing about 10,000 sessions, or check-ins, so to speak, per month. Think about how much time and effort that would take: it's an entire call center of nurses trying to do that. So we saved 86% of their time. They have 1,500 patients enrolled at any given moment, and we were able to surface 11% more critical events. Now, that doesn't mean that we've only been able to detect 11 things. It means that there were 10,000 data points, and AI and video were able to extrapolate the 11% that are critical and that you need to pay attention to. That really gets us out of that large ocean of data and surfaces what needs attention.
Dr. Brett Talbot (37:23):
And just really quickly, a little more individually based, I wanted to share a customer story that happened recently, where a patient left an addiction treatment center. If you know anything about addiction, the first 30, 60, 90 days after a rehab stay is when patients are most likely to relapse, but the clinic had very little visibility into how these patients were doing afterwards. They had enrolled the patient in a Videra Health workflow: he was checking in twice a week for the first 30 days, once a week for 30 additional days, and the system alerted that this person was starting to use quite a bit more absolutist language than was normal for him.
Dr. Brett Talbot (38:17):
And so that prompted somebody to reach out and say, hey, how are things going? Things are going well. Well, we picked up on this, tell us a little bit more about what's going on. And to make a long story short, what the provider and the patient were able to realize is that this was the start of his relapse cycle: he had started to get more concrete thinking, thinking in more absolutes, this always happens, this never happens. As they looked back, that was the start of his relapse cycle.
Dr. Brett Talbot (38:50):
They were able to intervene and get him back into some treatment, and his family said that this was a different cycle than they had ever seen. He had been through the cycle, through the revolving door, so to speak, so many times, and this was the first time that anyone was able to be proactive, to proactively find and notice what was going on with him before he relapsed. It may seem like such a small thing, but that small little trigger, that small little alert about absolutist language, his family attributes to saving his life. He's been sober since. And again, by disrupting that, by being proactive rather than just reactive, waiting until bad things happen in order to intervene, with tools like this, AI alerting, and getting out of that data paralysis, we're able to really save people's lives. I know we wanted... There are a couple quotes here from people; we can skip through those and make sure we have some time for questions and answers.
Michael Thurston (39:58):
Okay. Awesome. Something that keeps going through my mind as we've gone through this, and thank you so much, Brett, is we build a lot of reports. We build a lot of dashboards. Somebody has to be looking at those dashboards in order to get insight out of them, right? And so that's where closed-loop analytics become super important. But we're also talking a lot about alerting today, and alerting can be the most simple tool, right? Because as Brett was talking about, you can set certain thresholds and say, all right, if our daily sales volume drops below a certain number, I need to be alerted of that, I need to know. And all it is, is a simple text message or an email notification that goes out to one of the key stakeholders.
Michael Thurston (40:55):
There are lots of key indicators that we build into these dashboards. But if we can set up an alert on those, so that it's a push notification to the decision-maker, that makes all the difference, right? Because that's where the rubber starts to meet the road, in that they're going to know, oh, something's going on, I need to do something, right?
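A threshold alert of the kind Michael describes can be only a few lines. The metric name, threshold value, and notification stub below are illustrative; in practice the message would go out via email, SMS, or a chat webhook rather than a print statement:

```python
# Simple threshold alerting sketch: check key metrics against rules and
# build a push notification for every threshold that's crossed.
def check_thresholds(metrics, rules):
    """Return notification messages for every rule whose minimum is crossed."""
    alerts = []
    for name, rule in rules.items():
        value = metrics.get(name)
        if value is not None and value < rule["min"]:
            alerts.append(f"ALERT: {name} is {value}, below minimum {rule['min']}")
    return alerts

rules = {"daily_sales": {"min": 50000}}  # invented rule for illustration
alerts = check_thresholds({"daily_sales": 42000}, rules)
for message in alerts:
    print(message)  # stand-in for sending a text or email to a stakeholder
```

The value is in the push: nobody has to remember to open the dashboard, because the dashboard effectively taps them on the shoulder when a rule fires.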
Michael Thurston (41:17):
The other thing that is great is that a lot of times we do know what those key indicators are. Using machine learning and artificial intelligence, we can begin to do analysis in ways that we weren't able to in the past, right? We're able to leverage a lot more data and look for correlation and causality within the data, and trends that will then indicate to us, all right, if we have all of this data, what is the ultimate outcome? Let's use a simple example of sales. We need our sales number to hit our target, right? And here's all this data that we can then use to say we are going to hit it, or we're not going to hit it, right?
Michael Thurston (42:07):
We can leverage machine learning to help us understand what we should be alerting on. Take HR for an example: turnover is a lagging indicator, right? We don't know about turnover until it has actually happened. And so then we need to start to analyze what might cause turnover, right? Or what might be a leading indicator telling us that someone is going to leave the organization? That's where artificial intelligence and machine learning can be extremely helpful: we can begin to look for insights that we might not have thought of before, as well as insights that we could never have thought of on our own, right?
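A toy version of that leading-indicator hunt can be shown with plain correlation: score each candidate signal by how strongly it tracks who actually left, then alert on the strongest one going forward. The employee data and signal names below are entirely invented for illustration, and a real analysis would use far more data and more careful methods than a single correlation.

```python
# Toy sketch of hunting for a leading indicator of turnover: rank each
# candidate signal by its correlation with who actually left.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 1 = employee left, 0 = stayed (hypothetical history)
left = [1, 0, 0, 1, 0, 1, 0, 0]
signals = {
    "missed_one_on_ones": [3, 0, 1, 4, 0, 2, 1, 0],
    "overtime_hours":     [20, 5, 8, 25, 4, 18, 7, 6],
    "tenure_years":       [1, 6, 4, 2, 7, 1, 5, 8],
}

# Rank candidate signals by the strength of their link to turnover.
ranked = sorted(signals, key=lambda s: abs(pearson(signals[s], left)), reverse=True)
print(ranked[0], "is the strongest candidate leading indicator in this toy data")
```

Once a signal like this is identified, it becomes something you can set a threshold on and alert against before the lagging outcome ever materializes.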
Michael Thurston (42:52):
Going back to Brett's example of calculating somebody's speech pattern, how fast they're speaking, we wouldn't be able to do that on our own with our finite minds, right? We're able to leverage the computing power of machine learning and really harness it so we can begin to say, okay, this is a leading indicator. If it's a leading indicator, I need to be alerted on it, but only at this threshold. What we've learned is that there are some great tools at our fingertips, and we already have specific rules within our own organizations that we could then use for alerting.
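The speech-rate example combines neatly with the baseline idea that comes up later in the discussion. As a rough sketch, not Videra's actual signal pipeline, here is how a words-per-minute measure might be computed from a transcribed segment and flagged when it drifts too far from a patient's own baseline; the baseline value and tolerance are invented for the example.

```python
# Hedged sketch: estimate speaking rate (words per minute) from a
# transcribed video segment and flag large deviations from the
# individual's own baseline rate.

def words_per_minute(transcript: str, duration_seconds: float) -> float:
    """Approximate speaking rate from a transcript and its duration."""
    return len(transcript.split()) / (duration_seconds / 60)

def deviates_from_baseline(wpm: float, baseline_wpm: float,
                           tolerance: float = 0.25) -> bool:
    """Flag if the rate is more than `tolerance` (fraction) off baseline."""
    return abs(wpm - baseline_wpm) / baseline_wpm > tolerance

segment = "I guess things have been okay lately nothing really new"
rate = words_per_minute(segment, duration_seconds=6.0)  # 10 words / 6 s
print(rate, deviates_from_baseline(rate, baseline_wpm=150))
```

Because the threshold is relative to the individual's baseline rather than a population average, the same alert logic personalizes itself to each patient.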
Michael Thurston (43:36):
This is a great tool that I often see go underutilized just because it is so simple. People are always worried about being over-alerted, but if you continually manage the process and stay on it, have a quarterly review, or maybe a semi-annual review of your alerting process, that keeps you up to date on what alerts are going out. Maybe priorities change, maybe things shift, and that gives you a better idea of what you should be alerting on.
Michael Thurston (44:09):
Brett, thank you for your examples showing us alerting and some great closed-loop analytics. When we walked through your tool the first time and saw those red markers up there, a provider can automatically know, okay, this is something I need to be aware of. Oh, this is an area that needs my attention. It wasn't that they had to go look for that; it was automatically surfaced for them. What an awesome tool. Thank you so much for that. I think now we can turn our time over for any questions that we might have.
Katie Frank (44:48):
Yeah. Okay. Michael, if you'll flip to the next slide, or Brett, I forget who was controlling, we do have a few questions already submitted. I'll read the first one now. As a provider, what have you learned by using this tool that you would not have known otherwise? I'm going to say that one's for Brett.
Dr. Brett Talbot (45:10):
Yeah. As a provider... Say it one more time. As a provider?
Katie Frank (45:16):
As a provider, what have you learned by using this tool that you would not have known otherwise?
Dr. Brett Talbot (45:20):
Okay. What have I learned by using this tool that I wouldn't have known otherwise? Yeah, it's a good question. I think Mike was just talking about these leading and lagging indicators. And one thing that came to my mind, which I think is relevant here, is that oftentimes we think of data as: we have to get a data set and then analyze that data set. We do that at a population health management level, and in many ways that works really well. What I've learned as a provider is that I am much more effective when I have meaningful data about my patients that I don't have to analyze in the moment. When I'm face to face, I'm having to elicit information from my patient, get that input, analyze it, respond, and make a decision all at once. And what I've learned is that I was less effective that way. I'm more effective when I've been able to have that information at my fingertips prior to a face-to-face interaction. Now I'm empowered with information, with analytics, with insight that I would've otherwise never had.
Dr. Brett Talbot (46:37):
I would've never had information about what they were doing when they were not in front of me unless I spent the first half of my session saying, how have things been? Okay. How were the last 30 days? They've been good. Well, what does good mean? It was somewhat of a waste of 30% to 40% of my session or my interaction, just trying to get data input and not moving forward. Now I have information where I can move forward more quickly, right into meaningful treatment. So again, just to recap: I've learned that I'm less efficient trying to do it all at once, and that I'm a better clinician with better patient outcomes when I have this type of information at my fingertips to make better decisions.
Katie Frank (47:28):
That checks out. Having context for data that you don't have to place on it yourself is incredible and helps give it more meaning. This next question actually flows into it. They ask, could this tool also be adapted and useful for inpatient facilities treating cognitive and developmental disabilities, or is it strictly designed for behavioral health?
Dr. Brett Talbot (47:54):
Yeah. It's a good question. Let's define tool. There's the method of data gathering, and then there's the tool of the signals. The method of data gathering, particularly what we believe is the next evolution of digital health and of healthcare interactions, is asynchronous video, not having to be in front of each other. Video can be extrapolated across all of healthcare for a myriad of things that we would normally want to look for and listen for in any setting: inpatient, outpatient, partial hospitalization. We are used to having interactions and seeking to get something out of that interaction. Now we can extrapolate that to multiple clinical touch points, multiple patient experience touch points. That's definitely applicable across settings.
Dr. Brett Talbot (48:45):
Then we get to the tool of the signal. That's where we talk about developmental disorders and cognitive disorders. Again, we know from the literature, from traditional assessments, what we should be looking for and listening for, and we know what makes a patient a human. We want to capture that as well. So even if I'm treating somebody for pain management and I just need that vital sign of, is your pain at a 10 or is your pain at a two? We can get that off a video, plus so much more that we know from the research drives outcomes. Just treating the pain doesn't necessarily mean we're increasing quality of life; we have to capture all of this other stuff as well. So as we continue to build out signals, absolutely, we can create signals for almost any type of condition, because video in combination with other biometrics is that powerful.
Katie Frank (49:42):
Yeah. That's so crazy to think about in a really cool way.
Dr. Brett Talbot (49:49):
We get a little bit of, okay, there's a little bit of Minority Report going on here. But really, when you think about it, this is AI for good. This is AI for treatment outcomes. These are ways that we can treat more people. And if you think about it, if I went to go get labs because I'm feeling sick, maybe I have a viral infection, I'm used to going in and giving blood. I will literally let somebody take my blood to help find out what's going on with me, but in no other area are we as willing to do that. Doing a video submission, getting this type of data to our providers, and allowing those alerts to trigger lifesaving events is a lot less invasive than a blood draw.
Katie Frank (50:39):
Yeah. And it also seems to provide a stopgap for those in-between sessions, where you're like, yeah, I was at a four last time, but this next time I'm definitely going to be coming in hot at an eight.
Dr. Brett Talbot (50:59):
Yeah. Absolutely. As we see demand increase and our capacity to meet that demand decrease, we've got to figure something else out, right? Just adding more clinicians, trying to create more access in the way we are now, isn't moving the needle enough. We have to start to leverage digital health and technology tools to monitor patients at scale and surface the ones that need treatment immediately.
Katie Frank (51:25):
Would you say that providers are better able, with Videra, to take care of themselves? Are they less stressed about their patient load, or-
Dr. Brett Talbot (51:39):
100%, because often as providers we're like, look, here's what I could do in the time we have, good luck, let me know if something happens. Now I can have more confidence. We got a lot of feedback that with something running in the background checking in on my patients continually and letting me know when something's awry, I can really just focus on the more acute things. I'm able to sleep better at night knowing that people are being checked in on, and I do have visibility into my entire caseload, where normally it's, hopefully they call me if something happens.
Katie Frank (52:20):
Yeah. And crossing your fingers doesn't always work out, especially when someone's in a literal life-or-death situation. People would rather be safe.
Dr. Brett Talbot (52:31):
Yeah. People are pretty poor reporters. So even if something's going on, for them to realize that it's going on, or to notice I'm having a side effect and take the initiative to call, doesn't always happen. But we've seen people really, really easily just click on a link, do a two-minute video, and then all of a sudden we realize something's off even if they didn't. Right now we depend so much on them to manage their own care, and they're just not very good at it.
Katie Frank (52:58):
Right, to self-report you have to have the vocabulary for it, and that's something most people don't have. They don't have the vocabulary to say, oh, I'm feeling X, Y, Z because of this, or they may not be able to recognize why they feel a certain way.
Dr. Brett Talbot (53:12):
Yeah. They don't know what to look for and listen for, and so we have to get face to face so we can, and that's just not a scalable model.
Katie Frank (53:19):
Yeah. Wow. This was incredible. I think that covers all of our questions. Michael, do you have anything else to add?
Michael Thurston (53:30):
No, this is great. I'm thinking about what Brett said, in that you're able to get a baseline of a specific individual. I think that's so cool, because when you go into the doctor, you have those smiley-to-sad faces for your pain, right? And I'm like, well, is that a 10 for my pain tolerance or for my wife's pain tolerance? Because if it's my wife's, I'm much more of a wimp than she is. I'm always trying to get that right. If you have a baseline of, no, this is your pain tolerance and this is where you say your pain's at, that becomes much more personal and so much more valuable. So what an awesome tool to provide personalized healthcare and help improve those outcomes in an individual way. This is awesome.
Katie Frank (54:24):
Yeah. It's really incredible.
Katie Frank (54:26):
Thank you both for your time today. This was a really great session, and I hope that the attendees learned a lot. And I hope that those of you who are viewing this on demand were able to learn something and apply it in your everyday life. If you have any questions, you can always reach out to Veracity; our website is veracitysolutions.com. You can also find us on LinkedIn. And we'll also have Michael and Brett's company information available for everyone. Michael, Brett, if there's anything else, go ahead.
Dr. Brett Talbot (55:02):
Thank you everyone for joining and listening.
Katie Frank (55:04):
All right. Thank you everyone for joining us. Have a great rest of your week.