
Apps on Prescription?! – Perspectives on Digital Health Applications (DiGA)

Some time ago, we carried out an evaluation of the Digital Health Applications Ordinance (Digitale-Gesundheitsanwendungen-Verordnung, DiGAV) for the Federal Chamber of Psychotherapists in Germany (Bundespsychotherapeutenkammer, BPtK) focusing on the security of digital health applications, often referred to as apps on prescription.

The audit was intended to determine to what extent the requirements formulated in the ordinance adhere to security guidelines, security objectives, and best practices, and thus lay the foundation for operating digital health applications securely. The main subject of the examination was whether the requirements defined in the ordinance, including the procedural ones, are sufficient to ensure the security of digital health applications. The examination has shown that the requirements can generally be seen as positive. However, in order to make reliable statements about the IT security of digital health applications, further details and mechanisms should be clarified within the ordinance, which I would like to present in the following.

Regulative Framework

In December 2019, the Digital Healthcare Act (Digitale-Versorgung-Gesetz, DVG) introduced the app on prescription for patients (Sections 33a and 139e of the German Social Code Book V). This act ensures that the approximately 73 million persons covered by the German Statutory Health Insurance (Gesetzliche Krankenversicherung, GKV) are entitled to use a digital health application (DiGA) prescribed by a physician or psychotherapist.

After successfully completing an assessment procedure at the Federal Institute for Drugs and Medical Devices (Bundesinstitut für Arzneimittel und Medizinprodukte, BfArM), a DiGA is listed in a directory of reimbursable digital health applications (DiGA directory). The procedure is regulated by the Federal Ministry of Health (Bundesministerium für Gesundheit, BMG) in a supplementary legal regulation, the DiGAV. A DiGA is a CE-marked medical device of risk class I or IIa according to the Medical Device Directive (MDD) or the Medical Device Regulation (MDR), which will supersede the MDD.

Disclaimer

This evaluation is based on experiences from projects with a security testing perspective: penetration tests, concept reviews, and audits. As we are security researchers and not legal practitioners, we cannot provide legal advice, and our statements regarding legal issues should be taken with a grain of salt. Furthermore, ERNW is not a manufacturer of a digital health application, nor an auditor of a notified body, a certifier, or similar, who might have a different view on the requirements. In addition to the relevant expert knowledge for identifying security vulnerabilities in medical devices, systems, and environments, the authors have a background in medical informatics and basic knowledge about the certification of medical devices.

Requirements for Safety, Security, and Functional Capabilities

§ 3 para. 1 DiGAV defines that a digital health application’s safety and functional capability are considered proven by the CE conformity assessment. A medical device’s development is accompanied by a complex process that stipulates a risk analysis and safety requirements. This risk analysis must weigh the benefits of the product against the possible risks and identified hazards. The safety and the therapeutic or diagnostic performance of a medical device are demonstrated by conformity assessments. IT security is assessed using the assessment procedures defined in the German Medical Devices Act (Medizinproduktegesetz, MPG) or Regulation (EU) 2017/745 on medical devices (MDR), mainly in the context of the potential impact on patient safety. Security risks arising from vulnerabilities that do not impact patient safety are not explicitly mentioned in any of these legal frameworks.

§ 4 para. 1 DiGAV states that digital health applications must meet security requirements in line with the state of the art, considering the type of data processed and the associated protection levels. The legally indeterminate term state of the art is not clearly defined within the ordinance. The BfArM guide reduces this ambiguity by referring to the BSI standards 200-X, modules of the BSI IT-Grundschutz Compendium, and BSI technical guidelines. With the DiGAV, the BMG has defined additional criteria for ensuring security and data protection in the evaluation of digital health applications, which go one step further than the MDR.

Certificates

Certificates usually do not provide sufficient proof that a product is secure, as they reflect a single auditor’s or tester’s perspective on an application’s security posture at a given time and for a given scope. A validity of up to twelve months, combined with comparatively short release cycles, is not suited for making precise statements about the current state of the product. The BfArM emphasizes this in Section 3.4.1 of its Fast-Track Guide, noting that security needs to be anchored as a process:

“In order to meet the high market dynamics and the fast release cycles of DiGA, the DiGAV takes the approach of regarding information security less as a conglomerate of technical measures, but rather as a process to be anchored in the company.” [1]

Certificates often attest to processes and mechanisms in the product’s development cycle that sustain security and deal with vulnerabilities during the entire product life cycle. In the context of security, no such certificate and no accredited certification body exist yet. The DiGAV additionally states in § 7 para. 2 sentence 2 that a corresponding body must be accredited according to § 39 of the Federal Data Protection Act (Bundesdatenschutzgesetz, BDSG). It is questionable whether a certification body for corresponding security certificates should be accredited according to data protection criteria. Product evaluations show significant differences in content, methodology used, and technical depth.

The BMG needs to provide a precise statement that differentiates between data protection and security and clarifies the requirements for the related accreditation bodies. It remains open which certificates will be deemed suitable by the BfArM and how the BSI will be involved. We encourage regular and mandatory security assessments specifically focused on the digital health application: reviews of the application’s design, architecture, and code, as well as penetration tests of the entire digital health application from a security perspective.

Significant Changes

§ 18 para. 2 DiGAV defines that changes with a significant impact on data security are among the fundamental changes in the sense of the regulation. This is positive, since subsequent changes to an application’s or system’s security mechanisms should entail a re-evaluation of the mechanisms’ effectiveness and an assessment of potential residual risks. The reporting of significant changes is not required by the DiGAV itself but by § 139e para. 6 sentence 1 SGB V.

It is noteworthy that security-related changes are regularly made in the context of the remediation of vulnerabilities. The regulation does not require reporting all vulnerabilities to the BfArM, but only the subsequent remediation of those vulnerabilities through significant changes. A residual risk analysis or effort estimation is almost always performed to prioritize the remediation and to accept a subset of vulnerabilities and risks; this information is of particular interest in addition to the implemented remediation. CERT-Bund should be included in Coordinated Vulnerability Disclosure (CVD) processes to promote adherence to realistic deadlines for the reporting and remediation of vulnerabilities and to allow national security authorities to monitor the process.

Contents of the DiGA Directory

§ 20 DiGAV defines the contents of the DiGA directory. It is noteworthy that no information on security or the product’s protection requirements is given in the directory. In the interest of product transparency, it makes sense, for example, to provide a Manufacturer Disclosure Statement for Medical Device Security (MDS2) form. The MDS2 form enables manufacturers to communicate the security-relevant features, functions, and mechanisms of their products to operators in a structured way. It should enable operators and users to derive protection requirements as well as organizational and technical measures for the secure operation of a medical device. It is also suggested to add a contact for reporting security vulnerabilities to the directory so that it is clear how a CVD process can be initiated.
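
To illustrate what such a reporting contact could look like, the following sketch uses the security.txt convention for publishing a vulnerability reporting contact at a well-known URL; the domain, mail address, and policy URL are hypothetical placeholders and are not taken from the DiGAV:

    # Hypothetical example of a vulnerability reporting contact (security.txt),
    # served at https://diga.example/.well-known/security.txt
    Contact: mailto:security@diga.example
    Preferred-Languages: de, en
    Policy: https://diga.example/cvd-policy
    Expires: 2025-12-31T23:00:00.000Z

A machine-readable contact point like this does not replace an entry in the DiGA directory, but it makes clear to external researchers how a CVD process can be initiated.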

Annex 1: Requirements for Data Protection and Security

The DiGAV divides Annex 1 into requirements for data protection and for data security. The data security requirements are divided into fundamental requirements for all digital health applications and additional requirements for digital health applications with high protection needs.

The given structure is intended to keep time and resources within a reasonable range by simplifying the evaluation: the fulfillment of the criteria is queried via yes/no statements. The following explanations illustrate why this may be unsuitable for evaluating a digital health application against security requirements. Data protection requirements overlap with security requirements (see requirements 14, 15, and 28); data protection requirements are not assessed within this review’s scope, as the focus is on security. The BfArM guide to the fast-track procedure explains that the requirements are based on recommendations of the Federal Office for Information Security (BSI), especially the processes of an information security management system (see BSI standards 200-X), supplemented by modules of the IT-Grundschutz Compendium.

In terms of transparency, a view of a digital health application as an entire system is beneficial. However, the questionnaire in Annex 1 does not inquire about the security measures of the system’s individual parts and therefore does not differentiate between individual components or subsystems. It remains unclear whether the requirements in Annex 1 can be answered separately for these subsystems; this would be necessary to evaluate the implementation of the requirements, especially for digital health applications that represent complex communication systems. Furthermore, it remains open how the requirements can be presented clearly and understandably when a system comprises mobile applications, web applications, and additional desktop software. This ambiguity introduces a complexity that should not be underestimated. The problem emerges from the fact that the fulfillment of the criteria is queried via yes/no statements: a detailed statement only needs to be made if Not applicable is selected.

The DiGAV does not require a digital health application manufacturer to create and submit a security concept. For a proper evaluation, it is recommended that the submission comprise the respective security concept, which would allow the manufacturer to reference detailed descriptions of the application’s security mechanisms when answering the questionnaire.

The requirements of both classes do not give any concrete specifications for digital health applications but ask the manufacturer for confirmation in the form of a self-disclosure. Annex 1 thus attempts to obtain a clear and assessable model of the digital health application via the questions and the corresponding yes/no answers. It is not uncommon for security vulnerabilities to arise from discrepancies between a socio-technical system’s specified and real behavior. While congruence with industry standards and security best practices can be checked reasonably well this way, such a discrepancy can hardly be identified with this methodology.

Detecting this discrepancy is possible, for example, through external penetration tests. According to Annex 1, such penetration tests are only required if the application’s protection need is evaluated as high. From a security researcher’s perspective, it is unquestionably recommended that each subsystem of a complex application be thoroughly and comprehensively tested at least once from a security perspective before productive use. In the telecommunications, banking, and insurance industries, for example, this is already a regular part of information security management processes and a mandatory step before an application’s go-live.

Furthermore, Annex 1 of the DiGAV does not formulate any requirements for CVD processes either. A coordinated vulnerability disclosure can only be carried out successfully by a medical device manufacturer with established and well-defined processes. These processes must be established comprehensively and transparently within the company before vulnerabilities are reported, so that remediation can occur in a timely manner. From a regulatory point of view, no time limit exists for vulnerability disclosure for medical devices in Germany, provided that the vulnerabilities do not affect patient safety. The DiGAV refers to the MDD and MDR, as digital health applications are CE-marked medical devices of risk class I or IIa. It is recommended that CVD processes be established for digital health applications, including a timeframe for the remediation of vulnerabilities. Following the BSI standards 200-X, as recommended by the BfArM, this entails a considerable effort.

Cheers,
Julian

[1] Federal Institute for Drugs and Medical Devices (BfArM). The Fast-Track Process for Digital Health Applications (DiGA) according to Section 139e SGB V. A Guide for Manufacturers, Service Providers and Users. Online.
