Archive for the ‘Guest Posts’ Category
In a previous blog post for Popper and Co, I discussed how telehealth can be a life-saving tool in rural and urban settings. As devices get more versatile and affordable, we will start seeing additional efficiencies in health care delivery. Moreover, patients will (if they aren’t already) start demanding it. But does telehealth work in every situation? And how should telehealth systems developers adapt to an individual practice’s needs?
The Center for Telehealth and Cybermedicine Research found that while enthusiasm for telehealth was high among patients and (some) caregivers, not every clinic could perceive a benefit. It is easy, for example, to squander the advantages of this technology by skipping preliminary research on your particular center and patients. Telehealth must be needs-driven, filling gaps left where health services are not being effectively delivered.
In some cases, demand for telehealth may not be very high. If patients can find care at other facilities, or are reluctant to seek care for certain diseases, then telehealth may not be helpful. Similarly, if practitioners are reluctant to use telehealth tools, that reluctance may place such a system in jeopardy.
Electronic health record (EHR) integration can be one obstacle to telehealth (at least among practitioners). Aside from data security and confidentiality issues, practitioners perceive converting records to electronic format as a burden. However, conversion is becoming easier, and improvements in remote devices are allowing us to feed patient vital statistics (blood pressure, weight, oxygen saturation, and so on) directly into the record. If developed correctly, EHR adoption, remote monitoring, and health information exchange (HIE) systems can complement telehealth, making care delivery more efficient, improving health outcomes, and reducing costs.
Reimbursement is another issue. A project conducted by our ophthalmology division screened patients with diabetes who were at risk for retinopathy. An eye specialist reviewed retinal images captured with a camera that didn’t require dilating eye drops (so a highly skilled caregiver wasn’t needed at the patient’s location). Two hundred patients were scanned, and approximately 40 percent of them needed referrals. Of those referrals, 5 percent were in danger of going blind without immediate treatment. Here, telehealth provided better access, improved patient health, and reduced costs of care. Yet Medicare would not cover these types of diagnostic image interpretations (called “Store and Forward”) and the related referral services, because it covers such Store and Forward remote services only in Alaska and Hawaii. On an optimistic note: coverage requirements can change (in fact, Medicare has already changed coverage rules for some services)!
Other obstacles to telehealth success include:
- Not thinking about sustainability. Your program may have started out with a grant, but it needs to continue operating after the grant expires. Controlling technology costs is key, as is finding out which technology is most appropriate.
- What’s the best technology? It’s always changing, making it hard to know where to turn. And often, the latest tool isn’t the best solution for a specific practice or facility. At the Center, we are always helping end-users develop the right technology mix. Often, the right mix may have to be invented.
- Telehealth may not immediately fit into a practice’s workflow. If you only have one patient using your conferencing facility, that’s a problem. Reasonable volume is key to providing adequate return on investment, as is making telehealth systems scalable to incorporate other health services. You’ll need to develop a business operating plan stating how telehealth encounters will be scheduled, how to document each encounter, what you need to build and design, how many staff you need, and what your upfront and operating costs are.
- Lack of data. Make sure you document your encounters, and analyze whether your programs are successful. How many more patients did you see? What were the outcomes? What’s the impact on costs? This data also is crucial for systems developers to create the right solutions for telehealth.
Telehealth and information technologies are needed for healthcare reform in this country. It’s going to be an interesting time, getting people connected to care. But it’s the way we do it that’s going to make the biggest difference.
Do you work for a technology company that aims to make a difference in how telehealth is practiced? Are you a health care provider who believes telehealth can make a significant impact on your practice? What, if any, are its limitations? Please tell us what you think.
Tags: cybermedicine, health technology, medical technology, telehealth, telehealth system developers, telehealth systems, telemedicine
Posted in Guest Posts | No Comments »
Wireless technology is evolving in positive ways. It’s now more affordable, more accessible (thanks to broadband capacity), and more portable (via devices such as tablets and smartphones). And it is no exaggeration to say that this technology has made a life-saving difference for many patients who otherwise would not get care.
At the Center for Telehealth and Cybermedicine Research at the University of New Mexico, we studied the ability of telehealth tools (e.g., video connections, conference calling, electronic record sharing) to improve access and outcomes for rural New Mexicans suffering from a variety of health problems. In that role, we have been the incubator for several applications of telehealth designed to integrate these technologies and address important healthcare needs and gaps in access. One example was hepatitis C. While this disease is curable, multiple treatments are required and patients must be monitored for adverse effects. Project ECHO (Extension for Community Healthcare Outcomes) was initially incubated in our Center under the leadership of Dr. Sanjeev Arora. Results from that project, recently published in the New England Journal of Medicine, demonstrated how the program provided community healthcare providers with the expertise and tools they needed to treat hundreds, if not thousands, of people who previously were receiving no care for hepatitis C. In addition, outcomes for these remote patients were as good as outcomes for patients who traveled (often hundreds of miles) to the University’s medical center in Albuquerque.
This model was successful enough that it is now being expanded into other treatment areas, such as cardiology, rheumatology, and even adolescent psychiatry. For example, adolescents on Indian reservations, who have very high rates of suicide, are benefitting from counseling. Once the patient and practitioner are familiar with the technology, online counseling sessions are very similar to face-to-face encounters.
In addition to improving patient outcomes and access to care, telehealth can reduce costs in the clinic. At the University of New Mexico, our head of neurosurgery worked with the Center to set up a system through which surgeons could view patient CT scans via a secure web portal. Because of this system, 44 percent of risky patient transfers were avoided, simply by reviewing the scans remotely before surgery.
In rural New Mexico, the access improvements of telehealth appear obvious (though telehealth doesn’t work in every situation, an issue I’ll discuss in a future post). But the technology can also work in urban areas, bypassing transportation and traffic congestion problems by bringing virtual care to the patient. This is health care where it’s needed, when it’s needed.
One effect of health care reform that isn’t making headlines is the increased demand for services that will be placed on a limited resource: existing health care providers. Telehealth systems can help meet this new demand by extending specialist services to nearly everyone. For example, Dr. Arora, one of the few liver specialists in New Mexico, noted as we helped start his project that he couldn’t personally treat the 30,000 New Mexicans who then had hepatitis C. But with telehealth, community practitioners can reach specialists such as Dr. Arora and his team at the touch of a button or the click of a mouse.
What do you see as the limitations of telehealth? Is rural New Mexico a truly unique niche for this technology? In my next post, we’ll discuss the importance of setting up an operating plan, along with more cost-cutting benefits of telehealth. In the meantime, if you have any questions about my telehealth study or work, please post them here.
Tags: cybermedicine, health technology, medical technology, telehealth, telehealth study in New Mexico, telemedicine
Posted in Guest Posts | 3 Comments »
Last week I attended the 3rd annual mHealth Summit in Washington, D.C. Organized by the Foundation for the National Institutes of Health (FNIH), this multi-track conference attracted some 3,600 attendees and included representatives from across the health innovation spectrum: industry, investors, entrepreneurs, policy makers, standards bodies, NGOs, mobile operators, wireless technology producers, healthcare systems, insurers, pharma, regulators, researchers, and a multitude of others with an interest in the burgeoning space of ‘mHealth.’
While the lexicon for mHealth (an amalgam of “mobile” and “health”) is diverse and overlapping, a natural theme emerges if we look at the genesis of the term. The PC and ever-smaller, more powerful computer microprocessors spawned the digital revolution. Recently, we’ve seen the mobile revolution taking hold, wherein digital tools and wireless technologies have converged to allow us to be connected consumers, patients, and professionals. Now we are seeing a digital health revolution, wherein mobile, and the connectivity it provides, is enabling a new paradigm for health. Moreover, this phenomenon is spreading throughout the entire life sciences and health care ecosystem, including the strategic players. To characterize all of this as simply a combination of mobile and health is not only ambiguous (the term “mobile” has often been used interchangeably to mean a cell phone or mobility in general), but also sells short the fundamentals driving this paradigm shift. Of course, mHealth is a very catchy and accessible term – and proponents have steadily broadened its meaning – so it’s often easier to concede the point than to fight a good-natured but losing battle!
In terms of how the overall ecosystem and, in particular, businesses are leveraging digital, wireless, and mobile technologies, there are varying schools of thought and analyses – not surprisingly, the prism through which one views the space shapes the assessment.
For some analysts, like John Moore at Chilmark Research, the feeling is that “mHealth” is stuck in neutral. But this perspective is often colored by a framework in which mHealth is mostly about health apps and mobile tools in healthcare settings and population health management, plus smartphone health apps for consumers. The healthcare status quo is precisely what many are trying to disrupt; hence it will be those who can succeed at the peripheries, in novel ways, who may effect changes in the current business and reimbursement models. Necessarily, these innovations are going to see sporadic success, at least initially. Dr. Joe Kvedar, Director of the Center for Connected Health in Boston, echoes this sentiment in his recent blog post “Is disruption of mainstream healthcare the answer to our crisis?”.
In an effort to track some of the better-known startups making progress, healthcare startup accelerator Rock Health has compiled an extensive list of Digital Health Startups which, not surprisingly, are not household names. Their total numbers are small, but they represent the first wave of the digital health revolution coming our way.
Offering a bit of confirmation from an investment perspective, Dr. Mohit Kaushal, co-manager of the $100M West Health Investment Fund, indicated that one payer was planning to acquire 30 companies (MobiHealthNews). Counterbalancing this, another investor, Lisa Suennen, a respected commentator on the health innovation business, takes the view that investments are mostly stalled.
If we change our perspective once more and focus on wireless technologies converging with health and healthcare, one segment of that market – remote patient monitoring – has seen revenues double in the past four years, and they are expected to double again in the next four, according to research firm Kalorama (news). Moreover, Qualcomm, known for its wireless technology and particularly its cellular phone chipsets, announced a new wholly owned subsidiary, Qualcomm Life, at the Summit. The new entity has launched “2net,” a wireless hub device plus cloud-based data management platform for health and medical devices, which many hope will help catalyze the efforts of the 40-plus current partners and the many more companies hoping to deliver wireless health solutions more efficiently. Qualcomm Ventures has also established a $100M investment fund, whose portfolio includes one particularly exciting company, AliveCor, whose primary product is the iPhone ECG, invented by Dr. Dave Albert.
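As a quick back-of-the-envelope check on those market figures (purely illustrative arithmetic, not taken from the Kalorama report itself): a market that doubles over four years implies a compound annual growth rate of roughly 19 percent.

```python
# Illustrative only: implied annual growth rate of a market
# that doubles over a four-year span.
def cagr(start, end, years):
    """Compound annual growth rate between two revenue levels."""
    return (end / start) ** (1 / years) - 1

rate = cagr(start=1.0, end=2.0, years=4)  # a doubling over 4 years
print(f"Implied annual growth: {rate:.1%}")  # ~18.9% per year

# Sanity check: compounding that rate for four years recovers the doubling.
assert abs((1 + rate) ** 4 - 2.0) < 1e-9
```

Sustained growth near 19 percent per year is the kind of trajectory that tends to attract exactly the strategic and venture attention described above.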
From a clinical research perspective, the National Institutes of Health (NIH) has an mHealth Intra-Institute Interest Group, which tracks some 200+ research projects utilizing mHealth and wireless technologies. These tools are making monitoring and data gathering in clinical trials more efficient and powerful. Pharmaceutical companies and Clinical Research Organizations (CROs), like Quintiles, are also leveraging wireless tools in trials.
In summary, digital technology is everywhere, and mobile connectivity – enabled by wireless – is one of the driving trends in health and healthcare as well as in the broader consumer markets. No less an event than CES, the annual consumer electronics show that highlights trends in all things digital that consumers love, will host its second annual Digital Health Summit this year in Las Vegas (co-located with the main CES conference).
As Dr. Eric Topol, one of the keynote presenters at the mHealth Summit, puts it: after having digitized everything else in our world, we are now digitizing man. (I highly recommend Dr. Topol’s 2009 TEDMED talk.) Not too long ago that would have been a very scary proposition (visions of robots taking over the world!), but the Luddites are few in number these days. Quite simply, the imperatives for change – unsustainable healthcare costs, reduced access, an aging baby-boomer population, and marginal outcomes from our “sick care” system – are being met head-on by new digital health technologies and the emerging business models that accompany them. Ultimately, as digital meets health, resistance is futile. And that’s a good thing.
What are your thoughts on the potential ability of Digital Health (or mHealth) to meet current health and health care industry challenges? How are you, your patients or your customers using Digital Health today? We’d love to hear from you.
Tags: CRO, digital health, digital technology and health, FNIH, Health Summit, mainstream healthcare, mHealth, mobile health
Posted in Guest Posts | 2 Comments »
If you read Ken Walz’s recent post based upon his interview with AdvaMed Conference Producer Ray Briscuso, you already know that AdvaMed 2011 – bringing together more than 1,500 key MedTech executives from companies in every sector of the industry – promised to be an important conference, exhibition and partnering event for medical device, diagnostic and health information companies. Ken attended the conference – as did I – with the intent of absorbing all we could, sharing highlights, and reporting back via this blog.
From Ken’s perspective, one of the major themes permeating the remarks of many AdvaMed 2011 presenters was UNCERTAINTY. “While speakers referred to various causes for the uncertainty facing the MedTech industry, most of them tied it back to federal laws impacting health care and the potential for federal budget cutting to constrain the U.S. Food & Drug Administration’s ability to rapidly approve new devices and diagnostic tools,” he said.
For more on what some presenters and attendees had to say during and about the conference, read on:
Excerpts from Remarks Given by U.S. Department of Health & Human Services Secretary Kathleen Sebelius:
“For decades, the medical device industry has been a shining example of American innovation and creativity. Your products – from coronary stents to dialysis technology to cutting edge imaging – have played a key role in adding 30 years to the life of the average American over the last century. Our Department is committed to creating an environment where you can continue to innovate, improve lives, and create jobs.”
“As we look to the future, the medical device industry and the federal government share a vision of innovation unleashed. And the single most important thing the Food and Drug Administration can do to help make that vision a reality is to ensure the timely approval of safe and effective medical devices.”
Highlights from “CEOs Unplugged” Panel on Policy & Advocacy:
Jim Mazzo, President, Abbott Medical Optics; SVP, Abbott
On state of MedTech industry: “We are fortunate to be a part of this industry in this country… Recognize that there are many positives in spite of the issues we are facing and don’t get caught up in the uncertainties. We’re being recognized as an industry that is going to provide jobs and is going to provide life-saving technologies.”
Steve Ubl, President & CEO, AdvaMed:
On the FDA 510(k) Process: “The program is fundamentally sound. AdvaMed is proud of our policy work in this arena and we feel we are making solid progress in our discussions with the FDA.”
On the Medical Device Tax, healthcare reform, and other policy issues that may impact U.S. innovation: “The device industry is an engine of medical and economic progress… We can lead the way in innovation as long as we continue to have incentives for research and development.”
Mike Mussallem, Chairman & CEO, Edwards Lifesciences Corporation:
On state of MedTech industry: “The aging population, technological innovations, and growth in the emerging world are the wind beneath this industry’s wings.”
“This conference isn’t about a ‘Wow, gee’ factor for any particular technology’s potential. It’s more about the direction in which the industry as a whole is headed and where or how technologies will fit within.”
– Paul Cronin, Business Development Manager, Medical Technologies Division, IDA Ireland (Twitter: @idaireland)
“Our participation at this year’s conference is the beginning of a broader partnership between AdvaMed and AUTM and gave us an opportunity to talk about our global tech portal, which will be available early next year. It will allow universities to post information on their technologies available for licensing and for companies to search the database for free. We believe it will provide good match ups between university technologies and the companies interested in them.”
– Vicki L. Loise, Executive Director, Association of University Technology Managers (AUTM) (Twitter: @autm_network)
“The partnering forum at AdvaMed 2011 was awesome – it was what brought us to the conference. For our business, partnering discussions allow for finding common goals and for great use of limited resources. These discussions are much more effective for us than simply having an exhibit booth. In particular, we appreciated the opportunity to connect with large diagnostic companies like Qiagen, J&J and Roche.”
– Montserrat Capdevila, Director of Sales/Marketing, International Relations, Johns Hopkins University Tech Transfer (Twitter: @JHUTechTransfer)
Obviously, these are but a sampling of the takeaways, highlights and industry insights from the AdvaMed 2011 conference. Were you in attendance? What were your key learnings? Add your comments below.
Posted in Guest Posts | No Comments »
Genomic research is accelerating at a rapid pace and improvements in technology are fueling these advances (as has previously been addressed within the Popper and Co. blog). We’re now entering a phase of evaluating how to incorporate translated genomic information into clinical testing. With this comes a critical need to verify how and when to use a test, how these tests can modify clinical care, and how this process translates into improved outcomes for patients.
In April, Margaret Piper, Ph.D., M.P.H., presented at the Personalized Medicine Partnerships Conference outside of Washington, DC. Dr. Piper is director, genomic resources, at the Technology Evaluation Center of the Blue Cross and Blue Shield (BCBS) Association. Her presentation, “Assessing the Evidence for Genomics: Focus on the Patient,” centered on the impact of genomics on administrative processes and the adoption of new technologies into clinical care. As Dr. Piper noted, “We’re generating a lot of information that relates genomics to disease, but we’re only just starting to gather information on how to translate this into treatments and medical decision making.”
This rising tide of genomic data raises challenges for health providers (including medical students) as well as for insurance administrators and health technology companies. The BCBS Technology Evaluation Center seeks to provide healthcare decision makers with patient-centered assessments of new technology based on evidence (essentially, on published data). The center conducts systematic reviews, issues special reports, assesses clinical evidence, and uses a panel of independent medical experts to provide assessments that aid in the creation of guidelines and practice patterns. Most importantly, though, the center strives to evaluate outcomes that patients can appreciate.
After hearing Dr. Piper, I directed some questions to Caroline Popper of Popper and Co. to glean her insights. The first question involves the Center’s practice of issuing guidance based only on published outcomes.
JLM: Given that the Center is only providing guidance based on published evidence, what might this mean for timing in terms of adoption of new technologies into the marketplace?
CP: Making data-driven decisions and evaluating costs vs. benefits makes sense. However, there are other market forces at play that will drive adoption before rigorous assessment is complete—such as basic consumerism and the Internet. Over time though, a balance between sound data-driven decision making and what I call “irrational exuberance” will be found. Clearly, organizations like the Technology Center have a big role to play. Most importantly, they should make sure there’s no perception that simple cost-control is the only driver for adoption of technology into clinical practice.
The second issue that struck me concerned coding for reimbursement: Dr. Piper explained that healthcare plans are still in the dark about how to code molecular testing, for example. She noted that BCBS is working with various insurance plans to address this issue, and that coding changes that may be implemented in 2012 could help. Still, the question of what plans can do right now bears some further thought.
JLM: What can be done today to help insurance plans have a clear path for covering molecular testing? Can you shed some light on pending coding changes?
CP: Dr. Piper may have been referring to the implementation of codes by the Centers for Medicare and Medicaid Services that have more to do with the value of the information generated than with the work units that generate it. This process will reimburse tests that are very useful and improve quality while saving cost, even though the tests may be simple to perform. It will eliminate or reduce the common default practice of code stacking, in which providers bill as many existing codes as possible for every component of a test in order to achieve higher reimbursement.
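To make the contrast concrete, here is a minimal sketch of the code-stacking arithmetic. All of the component names, codes, and dollar amounts below are hypothetical, invented purely for illustration; they do not represent actual CMS fees.

```python
# Hypothetical illustration of code stacking vs. a single value-based code.
# All component fees below are invented for this sketch.

# Under code stacking, each component step of a molecular test
# is billed separately under an existing methodology code.
stacked_claim = {
    "extraction":     45.00,   # e.g., nucleic acid extraction
    "amplification": 120.00,   # e.g., PCR amplification
    "detection":      80.00,   # e.g., probe detection
    "interpretation": 60.00,   # professional interpretation
}

# Under a value-based scheme, one code covers the whole test,
# priced on the clinical value of the result rather than the work units.
value_based_fee = 150.00

stacked_total = sum(stacked_claim.values())
print(f"Stacked reimbursement:     ${stacked_total:.2f}")    # $305.00
print(f"Value-based reimbursement: ${value_based_fee:.2f}")  # $150.00
```

The point of the sketch is only the incentive: as long as per-component billing can total more than a single test-level code, providers have a reason to stack, regardless of how simple the test is to perform.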
And lastly, I wanted Caroline’s insight on the part of the presentation about how device or diagnostics companies bring their technologies to Dr. Piper’s Tech Center for assessment. According to Dr. Piper, a device or diagnostics company with a new technology would bring it to the attention of one of the 39 independent BCBS entities, which would then feed it up to the Tech Center. The Tech Center can’t talk directly to a company about everything needed for a positive assessment, but the Center staff might schedule an hour to talk through some suggestions.
JLM: What advice would you give to a diagnostics company getting ready to schedule an hour with the Tech Center?
CP: Have a clear idea of what the test will claim, and whether it will go to the FDA or be offered as an LDT (lab-developed test). Make sure you know how the test fits into the standard care paradigm, i.e., how the physician will use the information from the test in managing the care of the patient.
JLM: What questions should company representatives be prepared to ask and what should they have on hand for the meeting?
CP: They should ask what the Tech Center considers good measurable endpoints and whether it is feasible to collect these points during the validation process. They should come with the pilot results about the analytical performance of the test, a good understanding of the competing alternatives, and a reasonably comprehensive view of all the elements of an episode of care in which the test plays a role.
Do you have other questions for Caroline on this topic? Do you have experiences with how new diagnostic technologies factor into medical decision-making? Let us know your thoughts.
Tags: bcbs, genetics, genomic research, genomics, health providers, health tech, insurance
Posted in Guest Posts | No Comments »
Entrance to market is always a challenging process in the biomedical industry, but where one would normally consider product quality and peer assessment the leading indicators of success, it can be something quite different and unexpected that governs the pace of market adoption, such as simple profit-based economics. This is a lesson learned by the San Francisco-based company XDx, Inc. (Expression Diagnostics) in conjunction with the launch of its diagnostic test AlloMap®.
XDx’s Vice President of Corporate Development and Legal Affairs, Matthew J. Meyer, recently presented at the 3rd Annual Personalized Medicine Partnerships Conference in Bethesda, Maryland. Here, I recap some of the highlights of Meyer’s case study presentation and then offer insights from Ken Walz, one of the founders of Popper and Company.
According to Meyer, the heart transplantation market in the U.S. encompasses nearly 140 centers performing more than 2,000 transplants per year. With the average cost of the procedure around $750K, topping out at nearly $1M when post-transplant therapy and care are included, these procedures account for a growing $2B impact on the U.S. healthcare system. Transplant patients and hospital institutions face substantial upfront costs of care; in addition, each patient must pay for and endure between 20 and 35 painful biopsies intended to detect rejection and allow immunosuppression to be minimized.
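The figures Meyer cited roughly reconcile with one another: 2,000 transplants per year at up to $1M each in total care is about $2B annually. A quick sketch, using only the round numbers quoted above (illustrative arithmetic, not additional data from the presentation):

```python
# Rough reconciliation of the market figures cited above.
transplants_per_year = 2_000
cost_per_patient = 1_000_000  # ~$1M including post-transplant therapy and care

annual_impact = transplants_per_year * cost_per_patient
print(f"Estimated annual impact: ${annual_impact / 1e9:.1f}B")  # $2.0B

# Biopsy burden implied by the 20-35 biopsies-per-patient range:
biopsies_low, biopsies_high = 20, 35
total_biopsies = (transplants_per_year * biopsies_low,
                  transplants_per_year * biopsies_high)
print(f"Biopsies across the yearly cohort: "
      f"{total_biopsies[0]:,} to {total_biopsies[1]:,}")  # 40,000 to 70,000
```

That last figure frames the opportunity: tens of thousands of invasive procedures per year are the addressable target for a blood-based alternative.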
XDx was founded in 2000 with the goal of significantly improving patient management in transplant care and autoimmune diseases such as lupus through the development and commercialization of non-invasive diagnostic tests, and it gained FDA clearance in 2008 for its groundbreaking AlloMap® molecular diagnostic test. AlloMap uses a simple blood draw to achieve the equivalent of biopsy results by analyzing genetic activity (gene-expression testing) to predict the absence of heart transplant rejection. The test has become a “standard of care”: after several successful validation studies, including the landmark IMAGE trial published in the New England Journal of Medicine in April 2010, it has been clinically demonstrated to be “non-inferior” to biopsy and written into the International Society for Heart and Lung Transplantation (ISHLT) guidelines, the first and only blood test ever included in such guidelines.
So what’s delaying AlloMap® from achieving market success? One significant factor appears to be that the AlloMap® test reduces the need for biopsy, a procedure that generates revenue for the physicians who perform it and the institutions where it is performed. XDx has learned a lesson that other companies entering the personalized medicine market will surely need to heed: a product needs more than comparative clinical effectiveness and clinical validation to achieve rapid market penetration. Financial considerations matter, and when costly invasive procedures are replaced, companies will need to engage all stakeholders and find a financial middle ground where physicians gain something of value.
Following are Walz’s insights on the implications of this case study and its potential impact on advances in quality care.
Jamie: Ken, what are your reactions to this case study? How do you feel about physicians being slow to adopt a new diagnostics tool because use of this test removes part of their revenue stream? Can this happen with other diagnostics tools?
Ken: I’m not surprised. Because of the financial structure of our health care system, reimbursement (i.e., revenue) comes into play often when a physician is deciding whether to order a test or to do a procedure. We have seen many examples of this across diagnostics.
Jamie: Is there anything you’d suggest XDx could have done to soften the marketplace in advance of AlloMap®’s approval and introduction into the market?
Ken: Engage patients. Based on Matthew’s presentation (and from my recollection of hearing XDx’s CEO speak last year), the AlloMap® test is a much better option for the patient than biopsy. Within the parameters of what the FDA allows, XDx should continue to look for ways to raise awareness among patients, who would in turn demand the test as an alternative to biopsy.
Jamie: What broader implications based on this example would you like to highlight?
Ken: This case study is somewhat typical of what should be expected from a health care system with multiple actors and misaligned incentives. That’s not likely to change so companies should look for ways to engage the ultimate user (i.e., patient) to try to force alignment of those incentives and to ensure the patient can receive a medical solution that is in his or her best interest.
Let us know your thoughts on this issue of profit-based incentives impeding the adoption of new health care technologies that are better for the patient and save the healthcare system precious dollars. Should these issues be more strongly addressed through and by advocacy groups? How do you feel about how these issues can impact patient access to quality care? We look forward to hearing from you.
Tags: allomap, entrance to market, guest posts, health economics, health tech, jamie lacey moreira, matthew meyer, xdx
Posted in Guest Posts | No Comments »