Vital Signs Digital Health Law Update | Summer 2023 | Jones Day

Note From the Editors

We bring you Vital Signs, a curated, one-stop resource on the most notable digital health law updates from our U.S. and global contributors. In Industry Insights, we take an in-depth look at generative artificial intelligence (“GenAI”) end-user license agreements. With the improvements in GenAI usability and accessibility, workforces are increasingly using GenAI internally, the risks and benefits of which are tied to provisions in these end-user license agreements. In our Federal and State sections, you’ll read highlights of significant U.S.-based developments, including new data privacy laws in several states. In our Global section, we provide a host of updates from around the world. Thank you to our contributors who are committed to bringing you curated updates covering digital health developments of interest.

Lawyer Spotlight: Rob Latta, Emily Tait, and Kerianne Tobitsch

In this issue of Vital Signs, we highlight three lawyers who authored our Industry Insights feature story on GenAI end-user license agreements:

Rob Latta (Intellectual Property, Health Care & Life Sciences, and Technology, San Diego) is a transactional attorney focusing on technology-based life sciences transactions, including matters involving digital health, electronic health records, and artificial intelligence-supported drug development.

Emily Tait (Intellectual Property and Technology, Detroit) provides strategic guidance to health care and life sciences clients in connection with IP and data-related issues arising in the context of artificial intelligence, medical devices, and digital health products and services.

Kerianne Tobitsch (Cybersecurity, Privacy & Data Protection, Business & Tort Litigation, and Technology, New York) advises clients in the biotechnology and health care industries on investigations and compliance matters related to privacy, consumer protection, and cybersecurity, including for emerging technology matters such as artificial intelligence and machine learning.

Industry Insights: Risks and Rewards: Balancing GenAI’s Benefits With Potential Legal Uncertainty

While AI has been a hot topic for years, conversational generative artificial intelligence (“GenAI”) tools recently have become readily accessible and are being rapidly adopted as part of digital health platforms, providing vast opportunities for efficiency and innovation in connection with patient care while also creating new risks. Historically, these tools (e.g., chatbot interfaces and clinical decision support) were largely adopted through business-to-business agreements with the AI providers. With the improvements in usability and accessibility of GenAI, workforces are increasingly using GenAI tools for a plethora of tasks within an organization. A key challenge with GenAI has been determining how to balance the potential benefits against the potential legal uncertainty and risks from its use. Most of these legal issues have yet to be resolved but are already the subject of litigation, regulatory proposals, and policy discussions. Users of GenAI tools should be aware of the provisions in the applicable end-user license agreements (“EULAs”) to better understand their rights and potential risks.

GenAI is a subset of AI that generates new content output in response to a user’s input. Traditional AI processes data, detects patterns, and uses predictive models to make decisions or provide a rules-based response. GenAI, on the other hand, receives prompts and uses its training data and models, such as Generative Adversarial Networks (“GANs”) and large language models (“LLMs”), to transform user prompts and ultimately create a wide array of new content output, such as text, code, images, videos, audio, and simulations.
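
In practical terms, the “inputs” and “outputs” discussed throughout this article are simply the data sent to and returned by a GenAI service. The short TypeScript sketch below illustrates that flow; the endpoint, request fields, and response shape are hypothetical assumptions and do not reflect any particular provider’s API.

    // Minimal, hypothetical sketch of the prompt-in / content-out flow described above.
    // The endpoint, field names, and response shape are illustrative only.
    async function generateText(prompt: string): Promise<string> {
      const response = await fetch("https://genai.example.com/v1/generate", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": "Bearer <API_KEY>", // placeholder credential
        },
        // The "input" (user prompt) is transmitted to the provider; whether the provider
        // may retain it or train on it is typically governed by the applicable EULA.
        body: JSON.stringify({ prompt, max_tokens: 256 }),
      });
      const data = await response.json();
      // The "output" (generated content) is returned to the user; ownership and permitted
      // uses of that output are likewise addressed (or left open) in the EULA.
      return data.output_text;
    }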

GenAI’s ability to create new content from user prompts presents serious questions. May user prompts be used by the creator of the GenAI tool to improve the tool and its underlying training models? Who can claim ownership over the output and user prompts—the GenAI provider, the user, the end-user, or none of these? Who bears liability for harm caused by GenAI outputs? While the applicable EULAs may shed light on some of these questions, they are likely not dispositive of key issues. For example, a EULA may provide that, as between the user licensee and the owner/developer of the GenAI tool, the output is owned by the user. However, that does not resolve whether a particular output can actually be owned by the user or whether that output infringes on the rights of another, including the individual end-user. EULAs can vary greatly depending on the platform and whether they are accessed through a free, freemium, or enterprise version. Moreover, as technology evolves and new versions of these tools are released, the terms and conditions, including privacy practices, governing their use may also change.

Considerations Related to User Prompts

One of the most exciting aspects of GenAI is that the models can improve over time; however, this leads to various issues relating to the usage and ownership of inputs (i.e., user prompts). Various free and paid GenAI tools explicitly state that inputs may be used to train the platform’s underlying model. If a user inputs company-sensitive or proprietary information, that information may thus be used to train the tool and may become available to the tool’s creators, and potentially to others, as learning data.

Cohere, one of many providers that offers free trials of its platform, recently raised $270 million in a Series C round to grow its enterprise version. This version allows enterprises to utilize GenAI on the cloud platform of their choice while keeping data secure. Cohere’s current Terms of Use require that the user grant Cohere “a nonexclusive, worldwide, royalty-free, irrevocable, sublicensable, and fully paid-up right” to user prompts for a variety of purposes, including sharing them with third parties.

These EULAs may also include clauses that permit users either to opt out of allowing use of their inputs as training data or to limit such use to a model that is specifically trained for a particular customer; other clauses may prohibit the use of inputs as training data altogether. Approaches vary: Anthropic’s Terms of Service for its pay-as-you-go Claude AI, for example, allow Anthropic to use results and inputs “to provide, maintain, and improve the Services and to develop other products and services.” Some providers with paid enterprise models (i.e., private deployments) accept restrictions on the use of inputs as learning data, while their free public versions often do not offer the same protections. Some open, non-API models require users to fill out forms to disable use of inputs for training.

Even if inputs are not used for training, the confidentiality and security of inputted information may be at risk. Depending on the EULA, a GenAI company may potentially review, release, or sell the information, and a third party may access it if the GenAI platform experiences a security breach. This may lead to unintentional disclosures of trade secrets, loss of legal privilege, and privacy compliance issues under applicable privacy laws, such as the European General Data Protection Regulation (“GDPR”), if the user prompts contain any personal information or data. This is especially relevant in the U.S., as 11 states have passed comprehensive privacy laws that will become effective over the next two years. Regulators, such as the Federal Trade Commission and data protection authorities, have already started to address a range of data privacy, protection, and competition issues related to GenAI.

Some platforms, like Adobe and Microsoft’s Bing, include provisions that prohibit users from inputting confidential or private information into their GenAI platforms. However, not all GenAI platforms block user inputs that include confidential information, and some information may not be readily identifiable as confidential. Furthermore, some EULAs contain provisions that explicitly state that the provider does not guarantee the confidentiality or security of data used in connection with the GenAI service.

Many GenAI EULAs contain provisions establishing that the user is solely responsible for their inputs, including representations by the user that the input does not violate applicable laws or the rights of a third party. Some EULAs also contain language that states that the GenAI provider is under no obligation to review the accuracy or potential liability of inputs and that prohibits the user from inputting information that violates the IP rights of others. For instance, Anthropic’s terms require the user to represent and warrant that their inputs will not violate any third-party IP rights or data privacy laws, but permit the user to retain all rights, titles, and interests to the inputs. Retaining the IP rights to user inputs may become increasingly important, especially because the process and finesse of user prompt creation has led to the emergence of “prompt engineering” as a field and profession. Consequently, companies may derive competitive advantages from the engineering of user prompts. Companies contracting with GenAI platforms may be able to negotiate the rights and protections related to inputs.

Considerations Related to Generated Outputs

Regarding output ownership, many GenAI EULAs contain provisions that either disclaim or assign to the user all rights, titles, and interests to outputs created by the system. However, this does not guarantee that the user has IP rights to the output. Moreover, enforcing any IP rights may lead to obstacles involving current IP laws, particularly in the U.S.

Current U.S. patent and copyright law regarding ownership of AI-generated outputs remains unclear. Recently, the U.S. Supreme Court denied certiorari in Thaler v. Vidal. As a result, the Federal Circuit’s ruling that only human beings, and not AI systems, may qualify as inventors under U.S. patent law remains in place. Further, the U.S. Copyright Office issued guidance stating that AI-generated material is not protectable when a human solely provides a prompt that results in generated content. The degree to which AI can “assist” a human being in connection with an invention or work of expression is largely unresolved and will likely be subject to continuing litigation and legislative debate.

Most EULAs include language indicating that different users may receive the same or substantially similar outputs in response to their inputs. For example, Adobe’s GenAI terms explicitly state that “[t]he output may not be unique and other users of generative AI features may generate the same or similar output. The Output might not be protectable by Intellectual Property Rights.” Users should be aware that certain GenAI providers may require them to license any rights to the output to the provider for further exploitation, including use as training data. For example, Bing’s Terms of Use state that by using the GenAI services, the user is “granting Microsoft, its affiliated companies and third party partners permission to use the Captions, Prompts, Creations, and related content in connection with the operation of its businesses (including, without limitation, all Microsoft Services), including, without limitation, the license rights to: copy, distribute, transmit, publicly display, publicly perform, reproduce, edit, translate and reformat the Captions, Prompts, Creations, and other content you provide; and the right to sublicense such rights to any supplier.”

Certain providers, on the other hand, may assign users the rights to and/or disclaim ownership of the output, thereby asserting that the user is liable for the output content. In such situations, liability could arise if the output content violates any IP or privacy rights of others—a risk that the user may be unwilling to take on, particularly given that the user generally does not know how the output was actually created by the underlying AI system. Users should also closely review any outputs, as some providers, including Adobe and Microsoft, disclaim any warranties regarding outputs and “any implied warranties that the output will not violate the rights of a third party or applicable law …” Some software code-generating GenAI platforms may output code generated from open-source training data. Therefore, users must take steps to ensure they are managing the risks of GenAI use in software development and complying with open-source software license obligations to avoid disputes, including litigation.
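
One practical step, shown in the hypothetical TypeScript sketch below, is to screen GenAI-generated code for common open-source license markers before incorporating it into a codebase. The marker list and function are purely illustrative assumptions and are not a substitute for dedicated license-scanning tools or legal review of generated output.

    // Illustrative sketch: flag common open-source license markers in generated code
    // so a human reviewer can assess provenance and any license obligations.
    const LICENSE_MARKERS = [
      "SPDX-License-Identifier",
      "GNU General Public License",
      "Apache License",
      "Mozilla Public License",
      "Copyright (c)",
    ];

    function flagLicenseMarkers(generatedCode: string): string[] {
      // Return any markers found in the generated snippet.
      return LICENSE_MARKERS.filter((marker) => generatedCode.includes(marker));
    }

    // Example: a generated snippet carrying a GPL identifier would be flagged for review.
    const sample = "// SPDX-License-Identifier: GPL-3.0\nexport const add = (a: number, b: number) => a + b;";
    const findings = flagLicenseMarkers(sample);
    if (findings.length > 0) {
      console.warn("Review required - possible open-source license text detected:", findings);
    }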

Even if a provider’s EULA contains provisions that assign the rights of the output to the user, some terms restrict the use of such outputs. For example, some providers restrict all commercial use of outputs, and other providers only restrict the commercial use of outputs generated in free versions. There are also providers that place no commercial restrictions on outputs regardless of whether the output was created by a free or paid plan. Therefore, it is crucial for GenAI users to review the applicable EULA to fully understand any restrictions or limits on the use of outputs.

The risks associated with potential misuse of GenAI tools are vast. GenAI systems may hallucinate, provide inaccurate information, or generate harmful outputs. Many EULAs state that GenAI providers do not represent or warrant that outputs are accurate and limit or disclaim liability for inaccuracies or any damages caused by their services. Most EULAs also encourage users to independently evaluate the outputs through human review before relying on GenAI tools.

Lastly, users of GenAI tools should closely review the disclaimer and indemnification sections in a EULA. This is especially important if the GenAI output may be used in connection with a company’s external-facing products, services, or information. Some EULAs include disclaimers of all representations and warranties. Further, most EULAs contain provisions that favor and protect the provider. For example, Cohere’s indemnification provision requires the user to “defend, indemnify and hold harmless the Cohere parties from and against any claims, causes of action, demands, recoveries, losses, damages, fines, penalties or other costs or expenses” arising from or in connection with use of its GenAI platform. Thus, users must thoroughly review the output before using, sharing, or replicating it externally to minimize legal risks. This caution is particularly warranted because many EULAs either state that the provider will not indemnify the user or require the user to indemnify the provider.

We are only experiencing the beginning of widespread use and growth of GenAI systems. Users should review the EULAs of these platforms to fully understand their rights and protect against risks. Companies should also develop internal governance procedures covering GenAI use policies, risk assessment processes, and trainings to further mitigate potential legal risks related to the use of such tools.

United States Developments

FEDERAL

FTC Proposes Updates to Health Breach Notification Rule for Health Apps and Consumer Health Technologies

In May, the Federal Trade Commission (“FTC”) announced a Notice of Proposed Rulemaking to amend its Health Breach Notification Rule (“HBNR”), seeking to clarify the rule’s application to health apps, fitness trackers, and other similar direct-to-consumer health technologies. The HBNR requires certain companies not covered by the Health Insurance Portability and Accountability Act (“HIPAA”) that access personal health records to notify consumers and the FTC when there is a breach of such data. The FTC emphasized the need for such amendments due to the increased amount of health data collected from consumers and new technological developments and business practices involving such data (e.g., the use of third-party tracking technologies for marketing).

FTC and HHS-OCR Issue Joint Letter to Hospital Systems and Telehealth Providers Regarding Privacy and Security Risks from Online Tracking Technologies

On July 20, 2023, the FTC and the U.S. Department of Health and Human Services (“HHS”) Office for Civil Rights (“OCR”) jointly announced a distribution of warning letters to various hospital systems and telehealth providers regarding the use of online tracking technologies (e.g., cookies and pixels) and the collection and sharing of health-related data with third parties (e.g., for digital marketing purposes). It appears that the FTC and OCR may be preparing for broad enforcement relating to tracking technologies under HIPAA/the Health Information Technology for Economic and Clinical Health Act (“HITECH”) or the FTC’s Health Breach Notification Rule (“HBNR”). The FTC press release states that the letter was sent to 130 hospital systems and telehealth providers.

This follows the December 2022 bulletin from the OCR focusing on covered entities’ obligations under HIPAA when using third-party tracking technologies on their websites and mobile applications. The OCR defines “tracking technology” broadly to include “a script or code on a website or mobile app used to gather information about users as they interact with the website or mobile app,” such as cookies and pixel tags. The OCR noted that covered entities and their business associates can use tracking technologies only in accordance with HIPAA and applicable regulations. In doing so, the OCR provided a notably broad interpretation of how covered entities obtain individually identifiable health information when using such tracking technologies and interacting with individuals online or via mobile apps. According to the OCR, this may include data collection using such technologies on public-facing websites and mobile app services where the interaction relates to health conditions or services, even if the user is not a confirmed patient of the provider.

This joint announcement signals the potential for FTC enforcement against entities that collect and share health data using tracking technologies where HIPAA does not apply (e.g., consumer health apps/websites or health providers that have a hybrid function not subject to HIPAA), as evidenced by recent enforcement actions against a prescription drug platform and an ovulation tracking app not covered by HIPAA.
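
The “tracking technologies” at issue are typically small scripts embedded in a provider’s web pages. The simplified TypeScript sketch below illustrates the kind of data such a script might transmit to a third party; the endpoint, field names, and identifier handling are hypothetical assumptions, and real tracking tags vary widely in what they collect.

    // Hypothetical illustration of a third-party tracking script of the kind described
    // in the OCR bulletin. The analytics endpoint and payload are illustrative only.
    function sendPageView(): void {
      const payload = {
        // The page URL alone can reveal a health condition or service sought,
        // e.g., a path such as /oncology/schedule-appointment.
        pageUrl: window.location.href,
        referrer: document.referrer,
        // A persistent identifier can make the visit linkable to an individual.
        visitorId: localStorage.getItem("visitor_id") ?? crypto.randomUUID(),
        timestamp: new Date().toISOString(),
      };
      // Transmission to a third party is what raises the HIPAA/HBNR concerns discussed above.
      navigator.sendBeacon("https://analytics.example-thirdparty.com/collect", JSON.stringify(payload));
    }

    sendPageView();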

FDA Releases Draft Guidance on Securing Pre-Approval for Anticipated Changes to AI/ML-Enabled Device Software Functions

On April 3, 2023, the U.S. Food and Drug Administration (“FDA”) published new draft guidance on “Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence/Machine Learning (AI/ML)-Enabled Device Software Functions.” This guidance reflects the FDA’s approach to implementing section 515C of the Federal Food, Drug, and Cosmetic Act, which was enacted in late 2022 and exempts manufacturers from submitting a new marketing submission when changes to a device are consistent with a Predetermined Change Control Plan (“PCCP”) previously approved by the FDA. This guidance provides recommendations on the information that manufacturers of AI/ML-enabled device functions should include in their PCCPs to satisfy each component of the plan (i.e., the description of modifications, modification protocol, and impact assessment).

FDA Publishes Discussion Paper on the Use of AI/ML in the Development of Drug and Biologic Products

On May 10, 2023, the FDA published a discussion paper on “Using Artificial Intelligence and Machine Learning in the Development of Drug and Biological Products.” The paper encourages dialogue with key stakeholders on the uses and regulation of AI/ML in the drug development cycle, from drug discovery to post-marketing safety surveillance and advanced pharmaceutical manufacturing. The discussion paper describes potential uses of AI/ML and identifies a number of considerations and concerns relating to model and data explainability, reliability, privacy, safety, security, and representativeness. In this paper, the FDA also poses several questions to stakeholders regarding optimal manufacturing and governance practices to support innovation and the responsible use of AI/ML in this area. The FDA accepted comments on the paper until August 9, 2023, and has announced a virtual workshop in September 2023 to facilitate further exchange on these topics.

FDA Issues Final Guidance on Premarket Submissions for Device Software Functions

On June 14, 2023, the FDA released final guidance for manufacturers on preparing premarket submissions for device software functions. This new guidance is consistent with the 2021 draft guidance and adopts a binary framework that requires either “basic” or “enhanced” documentation for premarket submissions based on the risk posed by the device. In these updated recommendations, the FDA clarifies that documentation levels should be considered in connection with the device’s intended use and encourages manufacturers to document software concurrently with its development. Importantly, the FDA continues to encourage device manufacturers to rely on consensus standards related to software and recognizes the dynamic and rapidly evolving nature of digital health.

HHS-OIG Announces “Modernization” of Compliance Program Guidance Documents

On April 24, 2023, the U.S. Department of Health and Human Services (“HHS”) Office of Inspector General (“OIG”) announced plans to update its compliance program guidance documents (“CPGs”). The OIG’s stated goal is to “produce useful, informative resources—as timely as possible—to help advance the industry’s voluntary compliance efforts in preventing fraud, waste, and abuse in the health care system.” In recent years, the OIG has developed CPGs directed at different segments of the health care industry, including hospitals, home health agencies, durable medical equipment suppliers, and clinical laboratories. These CPGs promote the establishment and utilization of internal controls to ensure compliance with relevant statutes, regulations, and program requirements. The OIG encourages health care entities and professionals to implement effective compliance programs through the use of these voluntary guidelines, which can assist in preventing fraud, waste, and abuse in the health care industry. Changes to the OIG’s CPG program are as follows:

  • The OIG will no longer publish updated or new CPGs in the Federal Register. Instead, all updated or new CPGs will be available on the OIG website, and the public will be notified of new or updated guidance via public listserv and communications platforms.
  • The OIG will now issue one General CPG (“GCPG”) and multiple industry-specific CPGs (“ICPGs”). The GCPG will address topics spanning the health care industry as a whole.

The OIG expects to publish the GCPG by the end of 2023.

STATE

Host of States Adopt Data Privacy Laws

  • Nevada Passes New Health Data Privacy Law. The Nevada legislature passed an amended bill imposing comprehensive requirements pertaining to the collection, use, and sale of consumer health data for non-HIPAA-regulated entities. The law generally requires a consumer’s “separate and distinct” affirmative consent prior to both collecting and sharing consumer health data and prohibits the sale of such data without the consumer’s valid written authorization. Notably, the legislation does not allow for a private right of action, requires a contractual agreement between regulated entities and processors prior to any processing of data, and prohibits geofencing in certain instances. The legislation is set to go into effect on March 31, 2024.
  • New York Enacts Geofencing Ban to Protect Individuals in Health Care Facilities from Ads. The state of New York enacted a geofencing ban last month as part of its 2024 budget bill. The ban, which took effect on July 2, is intended to protect individuals who enter health care facilities from being targeted by third-party organizations with digital ads on their mobile devices. The ban is similar to one passed in Washington State earlier this year as part of its “My Health My Data Act” (discussed below), except that, unlike the Washington law, the New York law allows health care organizations to geofence their own locations. These geofencing bans also protect people who are receiving reproductive health care services from being harassed by outside entities.
  • Washington Enacts First State Comprehensive Health Privacy Law. On April 27, 2023, Washington State Governor Jay Inslee signed the My Health My Data Act (the “Act”), marking the first comprehensive state consumer health information privacy law. This first-of-its-kind state law becomes effective March 31, 2024, and will impose new requirements on the processing and sale of consumer health data in the state. While the Act does not apply to HIPAA-regulated entities, it applies broadly to legal entities that conduct business in Washington or produce or provide products or services to Washington consumers, and that determine the purpose and means of collecting, processing, sharing, or selling “consumer health data.” There is no exemption for nonprofit organizations and generally no threshold for applicability based on revenue or number of consumers within the state. The Act broadly defines “consumers” and broadly applies to “consumer health data,” defined as personal information that is linked, or reasonably linkable, to a consumer and that identifies the consumer’s physical or mental health status. This definition includes, among other things, biometric data, gender-affirming care information, reproductive or sexual health information, health data derived from non-health information that can identify a consumer, and “[p]recise location information that could reasonably indicate a consumer’s attempt to acquire or receive health services or supplies.” Among other things, the Act requires regulated entities to: (i) obtain consumers’ affirmative consent before collecting or sharing consumer health data; (ii) provide consumers with certain rights regarding their consumer health data; and (iii) enter into written contracts with processors relating to the use of consumer health data. The Act also prohibits geofencing in certain instances and makes it unlawful for any person or entity to sell consumer health data without obtaining prior consumer authorization.

State AGs Turn Attention to HIPAA Enforcement

The Health Information Technology for Economic and Clinical Health Act (“HITECH”), passed in 2009, gives state attorneys general the authority to bring civil enforcement actions on behalf of state residents impacted by violations of HIPAA.

  • In May 2023, New York Attorney General Letitia James entered into a $550,000 settlement with a medical management company for failure to adequately protect patient data from a data breach. The medical management company was accused of failing to update its software in a timely manner and perform testing that would have revealed its vulnerability to cyberattacks.
  • In March 2023, James reached a $200,000 settlement with a local law firm arising out of a data breach in which the private information of 114,000 patients of the firm’s client-hospitals was compromised. The firm was accused of maintaining poor data security measures which violated both HIPAA and state privacy law.
  • In February 2023, the state attorneys general of Pennsylvania and Ohio reached a $400,000 settlement with an Ohio-based DNA center over a data breach impacting 2.1 million individuals nationwide. The impacted data originated from legacy databases that were not in business use, but that the Ohio-based center acquired as part of a decade-old acquisition. The center was accused of engaging in deceptive or unfair cybersecurity practices when its privacy policy made material misrepresentations about its privacy practices.

Because state attorneys general are increasingly scrutinizing HIPAA-covered entities and their data breach prevention practices, companies must closely monitor their data privacy procedures to facilitate compliance with state and federal regulations.

State AGs Find Consensus on Consumer Protection Enforcements Against Health Care Entities

In April 2023, a bipartisan group of 11 state attorneys general—representing the states of Illinois, Kentucky, Mississippi, Nebraska, Nevada, Pennsylvania, South Dakota, Texas, Vermont, Washington, and Wisconsin—reached a $500,000 settlement with a telehealth vision care-focused company over violations of the states’ consumer protection statutes and health and safety laws. The company was accused of engaging in false and misleading advertisement of its online vision tests. This bipartisan action highlights what may be a growing consensus among state attorneys general regarding the use of state consumer protection laws to regulate the conduct of health care companies.

Reproductive Health Care Privacy in the Spotlight

In May 2023, District of Columbia Attorney General Brian L. Schwalb announced a settlement with an ovulation tracking app regarding the app’s data privacy practices. The company was accused of sharing private consumer information with third parties through “software development kits” integrated within the app and linked to two China-based companies previously flagged for suspect privacy practices. The company agreed to pay a $33,333 penalty to the District and to implement enhanced security and disclosure practices. This action is emblematic of new state legislative and attorney general interest in reproductive health care privacy.

Florida Legislation Expands Definition of Telehealth

Effective July 1, 2023, new legislation modifies the definition of “telehealth” so that an “audio-only telephone call” is no longer excluded from that definition. Emails and fax transmissions, when not part of a broader engagement, remain excluded from the definition of “telehealth.”

Idaho’s Virtual Care Access Act Facilitates the Use of Telehealth in the State

Idaho’s new Virtual Care Access Act became effective July 1, 2023. The act amends Idaho’s existing law to make it easier to provide telehealth services in the state by changing or replacing certain requirements for providing “virtual care.” Of note, the new statute:

  • Utilizes a broad modality-neutral definition for “virtual care,” namely “an umbrella term that encompasses terms associated with a wide variety of synchronous and asynchronous care delivery modalities enabled by technology, such as telemedicine, telehealth, m-health, e-consults, e-visits, video visits, remote patient monitoring, and similar technologies”; and
  • Includes specific continuity of care obligations, including that the provider or a member of the provider’s group is available for follow-up care to those receiving services via virtual care and that such patients are provided with a method to contact the provider of record for the service.

Kansas Pharmacy Regulations Clarify Use of Telehealth Services for Establishing Provider-Patient Relationships

In Kansas, pharmacy regulations were modified, effective June 2, 2023, to remove prior language that had created confusion as to the establishment of a provider-patient relationship via telehealth services. With this update, the pharmacy regulations clarify that a “legitimate medical purpose,” required for a valid prescription order, exists when a “drug is issued with a valid, pre-existing patient-prescriber relationship rather than with a relationship established through an internet-based questionnaire.” Previously, the undefined terms “an internet-based consultation and a telephonic consultation” were included along with an internet-based questionnaire.

Minnesota Extends Audio-Only Telehealth Sunset Date and Reallocates $1.2M to Study Telehealth Expansion

Minnesota’s governor signed into law Senate File 2995, amending the Minnesota Telehealth Act (Minn. Stat. § 62A.673), to permit audio-only telehealth until July 1, 2025, an extension of the previous sunset date of July 1, 2023. According to the Minnesota Department of Health (“DOH”), this occurs as the state is “develop[ing] permanent policies for those services.” At the governor’s recommendation, Senate File 2995 also reallocated $1.2 million intended to support telehealth studies in fiscal year 2023 to fiscal year 2024; these studies will examine the impact of telehealth expansion and payment parity on access to and quality of care, value-based payments, innovation in care delivery, and care disparities and equitable access. These developments coincide with the preliminary report of a DOH telehealth study that showed that, during the pandemic, the use of telehealth by Minnesotans with private health insurance significantly grew, with patients largely satisfied with the services received, and that “audio-only telehealth addresses narrow but important access issues, especially for Minnesotans in rural areas or with challenges accessing or using the technology supporting video-based telehealth.”

New Nebraska Law Provides for Telehealth Payment Parity for Many Providers

Nebraska insurance statutes were amended to require payors to provide payment parity for telehealth services with in-person services if the telehealth provider also provides in-person health care services at a physical location in Nebraska or is employed by or holds medical staff privileges at a Nebraska facility that provides such in-person services. The amendment’s author stated that the new requirement will “help[] enable better access to health care for Nebraskans … particularly in rural areas where there might not be as many opportunities for in-person services.”

Pennsylvania Proposes Modality-Neutral Definition of Telemedicine

Senate Bill 739, introduced in June 2023, would establish a modality-neutral definition of telemedicine as the “delivery of health care services to a patient by a health care provider who is at a different location, through synchronous interactions, asynchronous interactions or remote patient monitoring.” While the legislation includes telehealth practice standards imposing additional operational requirements on telehealth providers, the proposal also includes limited exceptions to licensure requirements for out-of-state providers engaging with existing patients in certain instances. The bill was referred to the Senate Banking and Insurance Committee on June 2, 2023, where it underwent amendments prior to its first consideration by the full chamber on June 27, 2023.

Utah’s Repeal of Online Prescribing Act Streamlines Telehealth Rules in the State

Effective May 3, 2023, Utah’s Online Prescribing, Dispensing, and Facilitation Licensing Act (“Online Prescribing Act”), U.C.A. 1953 § 58-83-101 et seq., was repealed by 2023 Utah Laws H.B. 152. Repealing this little-used and often confusing statute, which applied only to a specific set of virtual care offerings utilizing a questionnaire format, simplifies telehealth regulation in the state, which continues to be governed by a separate telehealth statute. While that statute’s definition supports multiple modalities for telehealth, an encounter that only involves an online questionnaire or patient-generated medical history will not be sufficient to meet the telehealth requirements.

Wisconsin Regulations Expand Telehealth Reimbursement and Coverage

Updates to regulations governing the Wisconsin medical assistance (“MA”) program that became effective after the termination of the federal public health emergency have expanded its permanent telehealth policy. “Telehealth” is newly defined and includes real-time interactive audio-only telehealth. The Department of Health Services must reimburse providers for covered “medically necessary and appropriate health care services” provided to MA recipients via telehealth, with few exceptions (e.g., home health aide and private duty nurse services). Additionally, psychotherapy and alcohol and other drug abuse treatment for MA recipients may now be conducted via telehealth. This expanded telehealth policy is intended to “provide more flexibility … to make it easier to receive health care.”

Global Developments

EUROPE

European Commission Proposes Major Reform of Pharmaceutical Legislation

On April 26, 2023, the European Commission published its proposed reform of the European Union’s (“EU”) pharmaceutical legislation (the “Proposed Reform”), the most significant revision since 2004. The Proposed Reform aims to: (i) address challenges such as high prices for innovative medicines and shortages of medicines; and (ii) ensure that the EU remains an attractive place for investment and a world leader in the development of medicines by adapting its rules to new technologies. Specifically, the Proposed Reform seeks to:

  • Create a single market for medicines to promote timely and equitable access to safe, effective, and affordable medicines for all patients across the EU (e.g., by establishing new incentives to encourage companies to make medicines available to patients in all EU countries);
  • Enhance security of supply and address shortages of medicines (e.g., through earlier warnings from companies on medicine shortages and withdrawals and by establishing prevention plans);
  • Accelerate procedures to reduce the administrative burden and authorization times for medicines to allow medicines to reach patients more quickly (e.g., by reducing the European Medicines Agency’s (“EMA”) assessment from 210 days to 180 days and the Commission’s authorization from 67 days to 46 days);
  • Offer an innovation- and competition-friendly environment for research, development, and production of medicines in Europe (e.g., by introducing regulatory sandboxes, which allow for the testing of new regulatory approaches for novel therapies under real world conditions);
  • Combat antimicrobial resistance (“AMR”) (e.g., by setting certain targets to be achieved by 2030, such as a 20% reduction in the total consumption of antibiotics by humans); and
  • Make medicines more environmentally sustainable (e.g., by strengthening environmental risk assessments for all medicines, including those already authorized, to limit the potential adverse effects of medicines on the environment and public health).

To meet these objectives, the Proposed Reform includes a proposed EU regulatory framework for all medicines (including those for rare diseases and for children), thereby simplifying and replacing the previous pharmaceuticals legislation. The proposed framework consists of:

  • A proposed Directive on the Union code relating to medicinal products for human use, which notably contains all requirements for authorization, monitoring, labeling, and regulatory protection, and other regulatory procedures for all medicines authorized at the EU and national level; and
  • A proposed Regulation laying down Union procedures for the authorization and supervision of medicinal products for human use and establishing rules governing the EMA. Notably, the proposed Regulation sets specific rules (in addition to those in the above-mentioned proposed Directive) for innovative medicines authorized at the EU level; provides rules on the coordinated management of critical shortages and the security of the supply of critical medicines; and sets out rules governing the EMA.

The Proposed Reform also consists of: (i) a Communication on revising the pharmaceutical legislation and measures addressing AMR, which seeks to explain the reasons behind the Proposed Reform; and (ii) a proposed Council Recommendation on ramping up EU actions to combat AMR in a “One Health” approach, which was adopted by the Council on June 13, 2023. The One Health approach is based on the principle that human, animal and environmental health are intrinsically linked. AMR, which has been identified by the EU as one of its top three health threats, results from over- or misuse of antimicrobials, in both health care and food production systems. Therefore, the fight against AMR requires a One Health approach, addressing human, animal and environmental concerns and involving a broad range of actors. The Parliament and Council will next discuss the Proposed Reform. The Commission stated that discussions will start “as soon as possible, but [it] cannot predict the timing for adoption at this stage.” Further information on the Proposed Reform is provided in the Commission’s Q&A on revising pharmaceutical legislation and Q&A on the proposed Council Recommendation on combating AMR.

European Commission and WHO Launch Landmark Digital Health Initiative

On June 5, 2023, the European Commission and the World Health Organization (“WHO”) announced the launch of a landmark digital health partnership based on the EU Global Health Strategy and WHO Member States Global Strategy on Digital Health. In practice, the WHO will adopt the EU system of digital COVID-19 certification to establish a global system, foster global mobility, and protect citizens across the world from ongoing and future health threats. This development marks the initial step in the establishment of the WHO Global Digital Health Certification Network (discussed further below), which will encompass a diverse range of digital products aimed at improving overall health care outcomes for all individuals.

Council of the European Union Recommends Joining WHO Global Digital Health Certification Network

On June 27, 2023, the Council of the European Union adopted a recommendation on joining the Global Digital Health Certification Network established by the WHO and on temporary arrangements to facilitate international travel (the “Recommendation”). The Recommendation aims to ensure the smooth transition of the previous EU harmonized COVID-19 certification system to a WHO Global Digital Health Certification Network. Regulation (EU) 2021/953 on the EU Digital COVID certificate expired on June 30, 2023. As a result, as of July 2023, the possible issuance and acceptance of COVID-19 vaccination, test, and recovery certificates should be made on the basis of and pursuant to the conditions laid down by the national laws of the EU Member States. The WHO will establish a Global Digital Health Certification Network, which is a mechanism to support the verification of certificates that are issued by its participants. The network would initially concern COVID-19 certification and could, at a later stage, also include the certification of other documents, such as routine immunization records and the International Certificate of Vaccination or Prophylaxis.

The Recommendation encourages Member States to connect to the WHO Global Digital Health Certification Network as soon as possible, but no later than December 31, 2023. Furthermore, each Member State wishing to join the WHO Global Digital Health Certification Network is encouraged to issue a new certificate with the maximum possible technical validity and to register it in the EU Gateway. Member States may decide whether to connect to the Global Digital Health Certification Network and whether to do so with existing technology or to connect at a later stage.

EU eHealth Network Adopts Guidelines on the Electronic Exchange of Health Data

In June 2023, the eHealth Network, which connects the EU Member States’ authorities responsible for eHealth, adopted Guidelines on the electronic exchange of health data under cross-border Directive 2011/24/EU (the “Guidelines”). The Directive provides rules for access to cross-border health care in the EU, and the Guidelines include supplementary clauses to the general rules for the electronic exchange of health data included in the Directive. These Guidelines are addressed to the Member States and serve as a guiding principle for the development and national implementations of patient summary data sets. The Guidelines contain clauses on data protection; identification, authentication, and authorization; quality standards and validation; and education, training, and awareness, among others.

European Union Agency for Cybersecurity Publishes Health Threat Landscape Report

On July 5, 2023, the European Union Agency for Cybersecurity (“ENISA”) published its first cyber threat landscape report for the health sector in the EU. The report provides a detailed analysis of typical cyberattacks and identifies key threats, actors, impacts, and trends over the past two years. In particular, ENISA found that ransomware is one of the primary threats in the health sector, accounting for 54% of security incidents. The report also highlights that patient data, including electronic health records, were the most targeted assets by hackers.

European Data Protection Supervisor Publishes Annual Report 2022

In April, the European Data Protection Supervisor (“EDPS”) published its Annual Report 2022. The report provides valuable insights into the main EDPS activities carried out in 2022, including the processing of personal data in the health sector, for which the EDPS has provided guidance and expertise. Also in 2022, the EDPS remained committed to assisting EU institutions, bodies, offices, and agencies (“EUIs”) in adapting COVID-19 measures over time. A key aspect of this support involved helping EUIs conduct inventories of the measures implemented during the health crisis. Other activities included issuing a Joint Opinion with the European Data Protection Board (“EDPB”) on the EU Health Data Space and on the Extension of the EU Digital COVID Certificate.

European Data Protection Board Adopts Guidelines on Right of Access

On April 17, 2023, the EDPB adopted the final version of guidelines on data subject rights regarding the right of access. These guidelines delve into various aspects of the right of access and offer precise guidance on its implementation in different scenarios, including the medical sector. The guidelines provide clarity on the scope of the right of access (e.g., the inclusion of medical files), the information that controllers must provide to individuals, the appropriate format for access requests, the primary methods for providing access, and the definition of manifestly unfounded or excessive requests.

European Parliament Committee Adopts Draft Negotiating Mandate on the First-Ever Rules for Artificial Intelligence

Two committees of the European Parliament, the Internal Market Committee and the Civil Liberties Committee, adopted a draft negotiating mandate for the AI Act. The AI Act was proposed by the European Commission on April 21, 2021, with the purpose of establishing a clear legislative framework for developing, commercializing, and using AI systems in the EU. (See also Jones Day Vital Signs: Digital Health Law Update Summer 2021). The committees proposed several amendments to the AI Act proposed by the European Commission. Among other things, they expanded the classification of high-risk areas to include harm to people’s health, safety, fundamental rights, or the environment.

Dutch Cabinet Develops National Vision of the Health Information System and Announces Continuation of Booster Program Start-ups and Scale-ups

In April, Dutch Minister Ernst Kuipers of Health, Welfare and Sport presented the National Vision of the Health Information System to the Lower House of Parliament. Kuipers explained that the increasing pressure on health care requires a new vision on digitization and information provision, which is necessary to continue to provide high quality, accessible, and affordable care now and in the future. The starting point for the vision is that necessary medical data must be available to citizens, patients, caregivers, informal caregivers, researchers, and policy makers to use for health, prevention, and/or care.

In May, the cabinet announced that it will continue for three more years (2023-2026) with its booster program for a strong start-up and scale-up business climate in the Netherlands: Techleap.nl. To support this initiative, Minister Micky Adriaansens (Economic Affairs and Climate) informed the House of Representatives that €15 million will be allocated to the organization. The Ministry of Economic Affairs also makes direct public investments in young growth companies, including for digitalization in health care, through, for example, the Deep Tech Fund (€250 million, together with InvestNL) and the Dutch Future Fund (€300 million, together with InvestNL and the European Investment Fund), as well as indirect investments through National Growth Fund projects.

Finnish Parliament Approves Act on the Processing of Social and Health Care Customer Data

On April 14, 2023, the Finnish Parliament approved the act on the processing of social and health care customer data. The act provides supplementary and clarifying provisions on top of the European General Data Protection Regulation (“GDPR”) when dealing with social and health care customer data and well-being data produced by the customer for the purposes of organizing and implementing social and health services. Among other things, the act harmonizes and clarifies legislation on the disclosure of customer and patient information and the right to access information.

Irish Health Authority Announces a Change of Case Management System for Medical Devices

On June 2, 2023, the Irish health authority announced a change of case management system for medical devices. Among other things, new cases will now be assigned to a dedicated email address.

Italian DPA Imposes €50,000 Fine on Regional Health Authority

On April 27, 2023, the Italian Data Protection Authority (“DPA”) announced that it had imposed a fine of €50,000 on a regional health authority for violating GDPR requirements. In particular, the DPA found that the authority had failed to implement appropriate technical and organizational measures to ensure the security of personal data (e.g., no anonymization or pseudonymization of personal data on the authority’s website) and infringed the principles of data minimization, integrity, confidentiality, and privacy by design and by default.

UK Information Commissioner’s Office Issues Neurotechnology Report

On June 12, 2023, the Information Commissioner’s Office (“ICO”) of the UK issued a report highlighting potential discrimination risks associated with the development and use of emerging brain-monitoring technologies. It stressed the importance of inclusive development processes for neurotechnology to prevent bias and inaccurate data. To address this issue, the ICO announced its intention to develop neurotechnology-specific guidance.

ASIA PACIFIC

Japan

On May 17, 2023, the Next Generation Medical Infrastructure Act (“NGMI Act”) was amended to relax the requirements for the use of personal medical information for research purposes. At present, the NGMI Act allows for the use of personal medical information for research purposes without an individual’s prior consent, but requires strict anonymization of such medical information. However, this strict anonymization requirement has deterred use of the system, especially for rare diseases, because the name of the rare disease must also be removed. In order to further promote use of the system and the research and development of therapeutic methods and pharmaceuticals, the amendment has introduced a new system to use pseudonymized medical information, which can now include rare diseases and other singular values. The amendment will come into force within one year from the promulgation date of May 26, 2023.

Recent and Upcoming Speaking Engagements

Colleen Heisey, FDLI, Advertising & Promotion for Medical Products Conference: Medical Device Promotion Enforcement, November 2023

Maureen Bennett, Peking University, Bringing a Medical Product to Market in the United States, October 2023

Maureen Bennett, ACI, FDA Boot Camp: Clarifying the Clinical Trial Process for Drugs and Biologics, September 2023

Laura Laemmle-Weidenfeld, PLI, Life Sciences 2023: Effective Life Sciences Compliance Programs and/or Investigations, September 2023

Cristiana Spontoni, Informa Pharma Law Academy, Clinical Trials, What You Need to Know, September 2023

Taylor Goodspeed, AHRMM, Legal Considerations and Compliance: Conference General Session on Human Trafficking, August 2023

Ann Hollenbeck, PYA 2023 Summer CPE Symposium: What’s Hot in Healthcare, State of the Healthcare Industry, June 2023

Ann Hollenbeck and Drew Jack, Plante Moran, Healthcare Regulation & Reimbursement Summit, May 2023

Cristiana Spontoni, EU Pharmaceutical Law Forum, Key Updates on Behavioural Advertising Through the Internet: New Developments for Cookies and Google Analytics, May 2023

Jessica Tierney, FDLI Annual Conference, Meat the Future: Regulation of Plant Biotechnology and Cultivated Meat, May 2023
