Given the vested economic interests involved, this tidal wave of CHD with healthcare applications is unlikely to disappear, and it is pointless to try to stop it. It is far better to channel it in order to generate a positive impact for patients, both individually and at the level of the healthcare system as a whole.
One of the key questions concerns the technological and medical validation of CHD with healthcare applications, i.e. their assessment as a veritable healthcare service. Of course, in many fields the issues are not major and the impact will not be particularly important. For example, if a system that counts the number of steps per day does not perform very well, it is a matter of deception about the product’s quality, which is unacceptable from a legal point of view. The issues take on another, more serious dimension, however, if we consider a CHD that measures the aqueous humour of the eye, derives glycaemia from it and thereby controls an insulin pump. It is therefore important, for every medical condition, to evaluate CHD according to objective criteria.
Concerning the methodology, when such products claim to provide a health service, they must be considered according to European Council Directive 93/42/EEC of 14 June 1993 concerning medical devices. It covers an extremely vast category of products, divided into classes (I, IIa, IIb and III), ranging from sticking plasters to hip prostheses and from compression stockings to pacemakers. A study of the reliability and quality of the measurements taken by a CHD is essential. As is often the case with the evaluation of medical devices, such studies need not be randomized clinical trials; observational studies conducted in everyday practice may be more appropriate, since even the notion of a comparator or a blinded trial is difficult to imagine here. The evaluation should not give rise to hostile reactions from the designers or producers of CHD with healthcare applications, because what will be required of them will be “fair” and proportional to medical safety concerns. The only misgivings of the companies present at the aforementioned debate concerned not the principle, but the slow pace of the regulatory procedures that govern clinical research, which today lead to delays of 6 months or more. Without doubt, observational studies would be better suited than randomized, controlled, double-blind clinical research versus a comparator. It is on the basis of these studies that reimbursement for CHD with healthcare applications, considered medical devices, could be requested, provided that the brand name of the device is listed among the products or services reimbursed by the health insurance agency.
The second question concerns the respect for and protection of individual privacy with regard to the information collected and processed by these devices. The framework for this evaluation is Law 78-17 of 6 January 1978 on information technology, data files and civil liberties, amended numerous times, notably to transpose European Directive 95/46/EC of 24 October 1995 on the protection of personal data into French law. This text states that any processing of personal information must be declared and that any processing of personal health data must be authorized beforehand. These procedures aim to ensure that the systems protect not only the confidentiality of data, but also their integrity and their availability. These three notions (confidentiality, integrity and availability) constitute the pillars of data security, and thus underline that confidentiality is not the only concern. Confidentiality is of course a fundamental notion, as it has a direct impact on the private life of patients; it implies that only those authorized should have access to this information. Access to this information by a person without authorization would be a violation of professional secrecy as defined, with regard to health data, by Articles 226-13 and 226-14 of the French penal code. The transmission of files to an unauthorized party would be regarded as diverting the file from its intended purpose, punishable under Article 226-16 of the penal code by 5 years of imprisonment and a fine of 300,000 euros. In the same way, the person responsible for the security of the computer system must implement the necessary measures to ensure that data are not damaged, deformed or destroyed.
Finally, and without prejudice to other relatively restrictive measures, the fundamental rights of patients must be respected, notably their right to be informed, their right to access, rectify or transfer their information, and their right to “be forgotten”, i.e. the right to have information concerning them erased. It is perfectly possible to meet all of these different constraints, but as soon as health data are involved, the authorization procedures are particularly long.
Companies consider this element particularly prejudicial because these constraints indirectly lead to a distortion of competition with regard to CHD with healthcare applications. Indeed, the producers of these devices are located in various countries, in particular the United States or Asian countries, which do not offer the same level of protection of individual privacy. Of course, in law, all products available in the European Union are subject to the same obligations with regard to the protection of individual privacy. However, sanctions and means of redress against non-European companies that sell their products over the Internet are difficult to implement, which gives them relative impunity.
An approach called Privacy by Design is gaining more and more importance. It builds respect for privacy into the project from the earliest stage, i.e. the design stage, by ensuring the pertinence of the data collected and by anticipating the information provided to users and their rights of access [16].
However, CHD providers must be aware that technological developments in the re-identification of de-identified patient data have become a major policy concern. As reported by Lee Tien [17], only in the past few years have we begun to realize how difficult it is to truly de-identify data, given the enormous amount of information about people that is publicly available to data miners, including hospital discharge summary databases. Modern re-identification techniques do not depend on personally identifiable information: any information that distinguishes one person from another can be useful. Researchers recently used modern techniques to re-identify supposedly anonymized genetic samples, determining not just the names of some of the people who originally submitted the samples but also those of their entire families.
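As a toy illustration of the linkage principle described above, a few lines of Python suffice to attach names to “de-identified” records by joining them with a public data source on quasi-identifiers such as postcode, birth date and sex. All names, values and field choices below are hypothetical and chosen only for the example:

```python
# Hypothetical "de-identified" hospital discharge records: no names,
# but quasi-identifiers (zip, birth date, sex) remain.
deidentified_discharges = [
    {"zip": "02138", "birth_date": "1945-07-31", "sex": "F", "diagnosis": "E11"},
    {"zip": "02139", "birth_date": "1962-01-05", "sex": "M", "diagnosis": "I10"},
]

# Hypothetical public data source (e.g. a voter roll) that carries names
# alongside the same quasi-identifiers.
public_roll = [
    {"name": "A. Smith", "zip": "02138", "birth_date": "1945-07-31", "sex": "F"},
    {"name": "B. Jones", "zip": "02140", "birth_date": "1971-03-12", "sex": "M"},
]

QUASI_IDS = ("zip", "birth_date", "sex")

def reidentify(records, roll, keys=QUASI_IDS):
    """Link records to named individuals whenever the quasi-identifier
    combination is unique in the public roll."""
    index = {}
    for person in roll:
        index.setdefault(tuple(person[k] for k in keys), []).append(person)
    matches = []
    for rec in records:
        candidates = index.get(tuple(rec[k] for k in keys), [])
        if len(candidates) == 1:  # unique match => re-identified
            matches.append((candidates[0]["name"], rec["diagnosis"]))
    return matches

print(reidentify(deidentified_discharges, public_roll))
# → [('A. Smith', 'E11')]: the diagnosis is now attached to a name
```

The point of the sketch is that no field is “personally identifiable” on its own; it is the combination, when unique in some auxiliary dataset, that re-identifies the person.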
It is also important that CHD providers anticipate, in their technical development, the counter-measures required against the main threats that can jeopardise data integrity and therefore patients’ health. As expressed by Ohno-Machado [18], particular attention has to be paid to replay attacks and External Device Mis-Bonding (DMB). In a replay attack, a valid data transmission is maliciously repeated by an attacker. For example, an attacker can first record the communication from the sensor that indicates a high glucose level, based on some auxiliary knowledge about the victim. The attacker can later retransmit the high-glucose information, pretending it is a “valid” message, which would cause the receiver (e.g., an insulin pump) to deliver an incorrect insulin dose and put the patient at risk. One way to avoid replay attacks is to introduce timestamps [19] within the message, so that one CHD with healthcare applications only accepts a message from another device if the timestamp in the received message falls within a reasonable tolerance range. In this case, an attacker cannot provide a valid timestamp by simply reusing a previously sniffed transmission. However, timestamping requires synchronization between mHealth devices, which may impose an additional communication burden and reduce battery life. Concerning DMB attacks, two system security issues have to be taken into account: the information leakage risk and the information injection risk. For the first, a malicious “app” on an authorized phone can steal sensitive patient data that are intended for an authorized app. For the second, a malicious app can feed false medical information into the original authorized app by intercepting the connection between the authorized external device and the authorized smartphone.
This injection risk is extremely dangerous for patients who rely heavily on health-monitoring apps (e.g., for blood-sugar concentration or irregular heart rhythm).
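The timestamp counter-measure described above can be sketched as follows. This is a minimal illustration, not a production protocol: the message format, shared key and tolerance window are assumptions for the example, and the sketch also adds a message authentication code (HMAC), which the text does not mention, since a bare timestamp could simply be forged by the attacker. A real deployment would further need key management and clock synchronization between the devices:

```python
import hashlib
import hmac
import json
import time

TOLERANCE_S = 5  # illustrative freshness window, in seconds
SHARED_KEY = b"sensor-pump-shared-secret"  # hypothetical pre-shared key

def sign(payload, key=SHARED_KEY):
    """Sensor side: attach a timestamp, then a MAC over the whole message."""
    msg = dict(payload, ts=time.time())
    body = json.dumps(msg, sort_keys=True).encode()
    msg["mac"] = hmac.new(key, body, hashlib.sha256).hexdigest()
    return msg

def accept(msg, key=SHARED_KEY, now=None):
    """Receiver side (e.g. insulin pump): reject tampered or stale messages."""
    now = time.time() if now is None else now
    mac = msg.get("mac")
    body = json.dumps({k: v for k, v in msg.items() if k != "mac"},
                      sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    if not (mac and hmac.compare_digest(mac, expected)):
        return False  # forged or tampered content
    # Stale timestamp => suspected replay of a previously sniffed message.
    return abs(now - msg["ts"]) <= TOLERANCE_S

# A freshly signed glucose reading is accepted...
reading = sign({"glucose_mg_dl": 210})
assert accept(reading)
# ...but the same sniffed message, replayed an hour later, is rejected,
# because the attacker cannot recompute a valid MAC over a fresh timestamp.
assert not accept(reading, now=time.time() + 3600)
```

The trade-off noted in the text appears directly in the sketch: the `TOLERANCE_S` window only works if both devices agree on the time, which is the synchronization and battery cost the authors mention.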