Q&A: LG on evolving its consumer offerings for health data collection


Consumers rarely think of their refrigerator or television as a tool that could give their healthcare providers personalized health data, but LG NOVA is exploring the possibility of consumer electronics becoming data collectors for preventive care.

Atul Singh, general manager of digital health at LG NOVA, sat down with MobiHealthNews to discuss how the North American Innovation Center of LG Electronics works to improve the provider/patient healthcare experience in the clinical setting and is considering how it can evolve its consumer electronics to improve health outcomes.

MobiHealthNews: How does LG work in the digital health space?

Atul Singh: LG has been in the healthcare space for decades, but it’s primarily in the area of displays, TV monitors and radiology equipment in hospitals. So, essentially, we sell hardware to hospitals.  

What we are doing differently now is helping hospitals maximize their investment in the devices they have purchased over the years and extract further value from them.

The services we have are basically virtual health-related services. These are telehealth services. Imagine virtual nursing, where a remote nurse can work with a bedside nurse or the floor nurse to assist them with a variety of tasks. And these tasks could be as simple as medication sign off, for example, where they need dual signatures, some elements of discharge, or even nurse training. A senior nurse remotely can train junior nurses who are by the bedside on a variety of tasks. 

The other use cases are patient monitoring. So, [in the Smart Cam Pro] device, there’s a camera, a bunch of sensors, and an infrared camera. So, this device essentially allows a remote nurse to monitor multiple patient rooms. They could monitor up to 16 rooms today, but that number can easily grow. So from a remote location, they can monitor 16 patients and basically converse with them if they need to. Otherwise, they are just passively monitoring for activity. 

It’s two-way in the sense that we have built AI capabilities within the device. So the device is monitoring, because you can imagine a remote nurse watching 16 patients at a time, 24/7, is very draining. It causes screen fatigue, and they may not be paying attention.

So what they can typically do is they can set the parameters for each patient that they want to monitor and the system will then keep an eye on that. 

MHN: Does the capability exist where notes can be generated for a physician?

Singh: We are introducing that capability now: ambient listening. The device has four microphones on top, so it’s listening to the conversation that’s actively going on, whether between the nurse and the patient or the physician and the patient. What we are doing is cataloging the entire conversation and then summarizing its key output so it can go in the patient chart.

We haven’t deployed it yet. We are still testing it, because these are clinical conversations: some of the words the doctor or the nurse uses may be clinical in nature or medical terminology, and we don’t want the AI engine to misrepresent them. So, a lot of testing needs to happen in that space.

This is where we are starting, but our ultimate vision is to follow the patient to the home. In the home, the consumer knows us through their interaction with our devices and appliances: the TV, the fridge, the washing machine and dryer, and so on.

We want to extend care from the hospital into the home once the patient is discharged, and we want to enable the appliances and devices they have already invested in to start offering care services.

We have about 500 to 700 appliances in the market right now with consumers, and a large majority of them have intelligent sensors already built-in that are capable of collecting and analyzing information on user behavior. 

So, how often they use the device, when they use it, general patterns of usage, as well as the appliance monitoring its own condition over its lifetime, so that if something is going to go bad, we can alert the customer and proactively address it before the appliance breaks down.

We have a lot more data about how the individual uses the appliance also–what time of day, how many times and so on.  

For example, how often do you walk in front of your refrigerator? The system can tell, and if it has established a pattern that every day between 6am and 8am there is some movement in front of the fridge a few times, that’s normal behavior. Then, when we notice there has been no movement, or the movement now starts at nine o’clock and lasts only 10 minutes, over time we can combine that data with other datasets to see whether something medical is creating a challenge for this individual, such that instead of the six-to-eight window, they have shifted their routine.

Or they completely stopped walking in front of the fridge. Did the location of the fridge change, or is there a medical issue such that they’re not able to come to the kitchen and do their regular tasks? But that’s a very loose data point. We cannot draw any inferences from it alone.
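The baseline-and-deviation idea Singh describes can be sketched in a few lines. This is a minimal illustrative example, not LG's actual system: the sensor data, function names and the one-hour slack threshold are all hypothetical, and a production version would use far richer statistics and more data sources.

```python
# Hypothetical sketch: hours of day at which a fridge motion sensor logged
# activity over a few baseline days. All values here are made up.
baseline_days = {
    "mon": [6.5, 7.2],
    "tue": [6.8, 7.5],
    "wed": [6.2, 7.9],
}

def typical_window(events):
    """Learn a 'normal' activity window from observed event hours."""
    hours = [h for day in events.values() for h in day]
    return min(hours), max(hours)

def is_deviation(day_events, window, slack=1.0):
    """Flag a day with no activity, or activity outside the learned window."""
    if not day_events:                      # no movement at all
        return True
    lo, hi = window
    return any(h < lo - slack or h > hi + slack for h in day_events)

window = typical_window(baseline_days)      # roughly the 6am-8am pattern
print(is_deviation([9.1, 9.2], window))     # routine shifted to ~9am -> True
print(is_deviation([6.9], window))          # within the usual window -> False
```

As the interview notes, a single flag like this is too loose to act on; it only becomes meaningful when corroborated against other appliance, social and clinical datasets.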

But if we marry that with other datasets, like how often the washer, the air purifier or the TV is being used? And we generally know the location of these appliances from where the customer is, their zip code.

Then we start looking at social determinants of health-type data and ultimately connect it with the clinical data of their providers to see, is there a change in the pattern? And if there is, can we do something with these appliances, with the smart TVs that they have, to start alerting the patient that, hey, you may want to do this or your doctor wants you to try something different. Or here’s just a simple alert that your medication is going to be up in three days. Do you want to refill? 

So, there are a lot of simple data points that we have right now, but in aggregate, they can bring intelligence to the interaction with the individual. 

MHN: How may these consumer electronics evolve to include health-related services?

Singh: Ultimately, you can imagine, over 10 to 15 years, whatever the time horizon is, being able to do predictive analysis. So, if you see reduced usage of certain things, or a different timeframe, or what have you, predictions could be made on that. If there could be an onset of a medical episode, can it be stopped or addressed ahead of time? But that’s far off. Right now, we are in the hospital learning, adjusting and improving the quality of care there, then moving into post-acute care, into long-term care, and eventually the home.

Tech has to catch up a little bit. The regulatory framework has to catch up. Payment models have to catch up. But everybody is moving in that direction.
