A health app developer’s guide to law and policy: a multi-sector policy analysis

Abstract

Background

Apps targeted at health and wellbeing sit in a rapidly growing industry associated with widespread optimism about their potential to deliver accessible and cost-effective healthcare. App developers might not be aware of all the relevant regulatory requirements, and best practice principles are still emergent. Health apps are regulated in order to minimise their potential for harm due to, for example, loss of personal health privacy, financial costs, and health harms from delayed or unnecessary diagnosis, monitoring and treatment. We aimed to produce a comprehensive guide to assist app developers in producing health apps that are legally compliant and in keeping with high professional standards of user protection.

Methods

We conducted a case study analysis of the Australian and related international policy environment for mental health apps to identify relevant sectors, policy actors, and policy solutions.

Results

We identified 29 policies produced by governments and non-government organisations that provide oversight of health apps. In consultation with stakeholders, we developed an interactive tool targeted at app developers, summarising key features of the policy environment and highlighting legislative, industry and professional standards around seven relevant domains: privacy, security, content, promotion and advertising, consumer finances, medical device efficacy and safety, and professional ethics. We annotated this developer guidance tool with information about: the relevance of each domain; existing legislative and non-legislative guidance; critiques of existing policy; recommendations for developers; and suggestions for other key stakeholders.

Conclusions

We anticipate that mental health apps developed in accordance with this tool will be more likely to conform to regulatory requirements, protect consumer privacy, protect consumer finances, and deliver health benefit; and less likely to attract regulatory penalties, offend consumers and communities, mislead consumers, or deliver health harms. We encourage government, industry and consumer organisations to use and publicise the tool.

Background

There is enormous optimism that mobile phone applications targeted at health and wellbeing (“health apps”) will improve access to cost-effective healthcare. The health apps industry is just a few years old, but already contains more than 250,000 products [1]. Governments and health professionals are actively involved in driving or funding app development [2, 3], endorsing existing apps [4, 5], and encouraging citizen use [6,7,8].

However, health app use could be harmful to consumers. For example, consumers may suffer loss of personal privacy, leading to reputational damage with subsequent financial, social, insurance or employment discrimination [9,10,11]. They may be misled into making unwanted purchases, or suffer direct financial losses [12]. Symptoms of ill-health may be exacerbated [13, 14], under-diagnosed [15], or over-diagnosed [16,17,18]. It is important for those involved in health app development (hereafter referred to by the umbrella term of “app developers”), regardless of whether they focus on the commercial, technical or health-related aspects, to be aware of the potential for negative outcomes, and to work towards producing apps that deliver benefits but also avoid, or at least minimise, possible harms.

To this end, government agencies that regulate health apps have begun to produce guidance to explain legislation and give advice on how to develop apps that are legally compliant and operate according to community values and expectations [19,20,21,22]. Industry groups have also released self-regulatory ethical codes that inform and instruct app developers on industry standards for professional practice (for example [23,24,25]). However, the full range of legal and professional guidance may not be readily obvious or accessible to app developers. This is partly because the pertinent guidance exists across a variety of sectors, including medical device, privacy, digital technology, advertising and finance, and is administered by separate and largely independent agencies [26]. App developers may not be aware of all relevant sectors, particularly those not specifically directed towards apps (for example, guidance about acceptable digital content more generally). App developers may also struggle with the depth of assumed knowledge in published guidance, particularly guidance from unfamiliar sectors. For example, medical device legislation is recognised as being confusing even to those who work professionally in medical device regulation [27, 28]. Further, similar terms can have differing meanings across sectors. For example, medical device legislation tends to use a narrow definition of ‘health services’, meaning the diagnosis, treatment and prevention of disease [29]. Privacy legislation tends to use the same term to mean assessing, maintaining or improving health [30].

App developers who do not conform to legal requirements may be penalised. Developers in the United States (US) recently settled with the Federal Trade Commission (FTC) for allegedly marketing health apps with unsubstantiated claims about their intended health function [31, 32]. There is a need for comprehensive and easily understandable guidance to assist app developers who wish to create legally compliant and ethically acceptable health apps. This paper describes the process of developing a guidance tool to assist app developers who are producing mental health apps distributed in the Australian market. The tool, the App Developer’s Guide to Law and Policy, is publicly available through the Australian Communications Consumer Action Network (ACCAN) [33]. We expect that this tool could inform a template that would be transferable to health apps more broadly, and could be used by developers working in other jurisdictions if populated with local laws and standards.

Methods

This study was part of a larger project looking at the policy environment for mental health apps, which was funded by the Australian Communications Consumer Action Network (ACCAN), the peak body for consumer representation in the telecommunications industry. We use the term ‘mental health apps’ to include apps that offer, or claim to offer, one or more of: information, diagnosis, monitoring, treatment and support in relation to mental health symptoms, behaviours or illnesses. We chose to focus on mental health apps because digital mental health tools, including apps, are being heavily promoted by governments and the World Health Organization (WHO) [7, 34, 35] yet are associated with a risk of harm, including a heightened risk of stigma or discrimination arising from loss of privacy, and potential risks to consumer safety from under-treatment. Conversely, false positive diagnoses and over-diagnosis (for example, true positive diagnoses of mild, self-limiting conditions where treatment carries no benefit) may be common, especially if the line between mental illness and normal variations in mood or in response to life events is not presented clearly.

The current study was informed by consumer advocacy and also by a policy analysis relevant to mental health apps. We used a broad definition of policy, encompassing the range of tools that provide regulatory oversight of mental health apps along their journey from development through distribution to selection for use by consumers. These policies included legislative guidance, industry self-regulation, post-market certification and evaluation programs, as well as general consumer tip sheets about selecting an app.

Sampling

We used an iterative combination of broad and narrow search strategies to identify legal, industry and community guidance relevant to developers of mental health apps. We performed broad searches through PubMed and Google for literature on regulatory oversight of relevance to mental health apps. In order to capture legislative tools and key non-government influences, we searched for any policy authored by a well-known organisation that sought to influence the development, distribution or selection for use of health apps available in Australia and was applicable to mental health apps. Our inclusion and exclusion criteria are provided in Additional file 1.

We read through identified policies and hand searched their reference lists. We inductively identified six relevant regulatory sectors (privacy, security, digital content, promotion and advertising, consumer finance, and medical device) and searched purposively through the websites of the various Australian government departments providing oversight of each sector. Although we concentrated on Australian laws, we included legislation from other jurisdictions that was likely to be influential in Australia. For example, we included medical device legislation and guidance produced by the USA, the UK, the EU, and an international working group (the International Medical Device Regulators Forum, IMDRF), because the related Australian Government department (the Therapeutic Goods Administration, TGA) referred specifically to these policies. We also searched purposively through the websites of non-governmental organisations that had authored policies, looking for additional self-authored guidance, and following links and references.

We cross-checked our search strategy by circulating a preliminary list of identified policies to our research partners, including ACCAN. These partners confirmed that there were no major omissions from our list.

Tool development

We read through all the policies to identify key elements of health apps that attracted regulatory oversight, and organised our developer tool around these domains. We drew further on policies and related references to provide information in each domain about legal requirements, compliance, and best practices. We defined best practice to mean non-legally-binding guidance contained within policies identified using the above process. We drew on the expertise of our research team, which included several mental health clinicians, academics with experience in commercial bias and health policy, and a telecommunications consumer group. We cross-checked the breadth of the tool by comparing it against various parameters of app quality listed in the published literature [36, 37].

Results

We identified legislation and industry guidelines relevant to mental health apps from multiple, sector-specific domains. We identified 29 relevant policies, listed in Table 1. The key regulatory domains that we identified as being relevant to developers were:

  1. consumer privacy
  2. data security
  3. content
  4. promotion and advertising
  5. consumer finances
  6. medical device efficacy and safety
  7. professional ethics

Table 1 List of identified policies relevant to developers of mental health apps in Australia

Most policies addressed just one or two relevant domains (10/29 and 8/29 policies, respectively). We identified a dearth of comprehensive information collating the broad range of relevant legislative and ethical guidance in an easily accessible form that well-intentioned app developers could use to guide their work. We extracted and collated the information into a single guidance tool (see Additional file 2). The following sections provide domain-specific information on: why each domain is relevant to health apps, underlying regulatory principles, and dominant policies that provide regulatory oversight. We also discuss any regulatory shortcomings, and provide recommendations for app developers and other key stakeholders including consumers, app distributors, industries and governments. In each section, we include an excerpt from the guidance tool as an example of how these analyses have been translated for a health app developer audience in the Australian market [33].

Consumer privacy

Health apps may access consumers’ identifying information and personal health information including diagnoses, symptoms and use of an app’s therapeutic services. Privacy legislation seeks to protect individuals by requiring consent for the collection, use, disclosure or retention of personal information including sensitive health information. It also requires that individuals be given access to, and have the option to correct, their information. In Australia, the applicability of privacy legislation to health apps depends on the type of developer, the developer’s annual revenue, and the nature of personal data collection. It is likely that most developers of health apps available in Australia will be bound by Federal privacy legislation (the Privacy Act 1988 (Cth)) [22, 38,39,40]. See Table 2 for key assessment questions to determine the applicability of privacy legislation in the Australian context.

Table 2 Sample developer guidance: Australian privacy laws and global best practices [33]

Current policies aimed at protecting the privacy of mental health app consumers are insufficiently enforced: many apps do not provide privacy policies, and those that do often deliver information via impenetrable documents that are difficult for non-technical users to understand [41,42,43,44]. In addition, some practices are unacceptable and/or unlawful even with full disclosure and consent [45]. For example, collection and sharing of sensitive consumer information, including mental health information, for reasons unrelated to the main function of the app is not allowable under Australian legislation [22]. This is because knowledge of sensitive information can have significant repercussions for individuals, including employment, insurance, and financial discrimination [45, 46]. The complexity of the so-called “mobile ecosystem” is so high that app developers and app users may not know who data is shared with, and for what purpose [22, 47]. Enforcement of privacy protections relies heavily on consumer complaints, but consumers may not know the law and may be unaware that their personal data is being shared, or how it is being used to define their reputation in terms of employability, insurability or fiscal dependability.

Recommendations

In the tool (Additional file 2), we recommend that app developers ensure that any practices to share/sell consumer data are transparent and appropriate with regard to protection of privacy and the interests of the general public. This is in keeping with other practices that are required to respect individual interests, such as human research, where formal codes of ethics mandate the consideration of issues such as transparency, consent, and overall acceptability of research practices [48]. At the moment, it is likely that developers of apps available in Australia will continue to share/sell the personal or health data of their consumers with third parties without specific consent, since this is an important business strategy for many apps [49] and it may be allowable in other jurisdictions [50,51,52]. In addition, enforcement of legal restrictions may be difficult. For example, it may be impossible for legislators to follow the chain of entities who have access to the data, and difficult for them to enforce local laws in relation to products that originate from other jurisdictions [40].
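
To make the transparency and consent recommendation above more concrete, the following is a minimal sketch, not taken from the guide itself, of how a developer might gate the sharing of sensitive data behind explicit, purpose-specific consent. The names (ConsentRegistry, DataPurpose) are illustrative assumptions, not terms from the tool or from Australian legislation.

    // Illustrative sketch only: purpose-specific consent before sharing sensitive data.
    enum class DataPurpose { CORE_APP_FUNCTION, ANALYTICS, THIRD_PARTY_SHARING }

    class ConsentRegistry {
        private val consents = mutableMapOf<DataPurpose, Boolean>()

        // Record the user's explicit choice for one specific purpose.
        fun record(purpose: DataPurpose, granted: Boolean) {
            consents[purpose] = granted
        }

        // Absence of a recorded choice is treated as "no consent".
        fun hasConsent(purpose: DataPurpose): Boolean = consents[purpose] ?: false
    }

    fun shareMoodDataWithAnalyticsPartner(registry: ConsentRegistry, send: () -> Unit) {
        // Sensitive mental health data leaves the device only if the user explicitly
        // agreed to this specific purpose; a bundled "accept all" or silence is not enough.
        if (registry.hasConsent(DataPurpose.THIRD_PARTY_SHARING)) {
            send()
        }
    }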

As a start, we encourage public discussion and advocacy in this field with the aim of increasing public awareness of current practices and possible implications, and generating public input into regulatory standards or a government-endorsed code about what is and what is not acceptable in terms of health app data-sharing. Consumers and governments should also exert pressure on industry bodies to adopt higher standards that improve consumer privacy. For example, app distributors should enforce the inclusion of user-friendly privacy policies for all health apps, device manufacturers should ensure that privacy-friendly settings are the default setting on phones, and data-brokers and other third parties should develop strong self-regulatory systems around practices for obtaining, sharing, using and retaining consumer data.

Data security

While privacy legislation requires certain security measures, there is no specific security legislation that protects consumer data from attack by third parties. See Table 3 for an excerpt from the guidance tool. Many health apps have inadequate security arrangements. For example, an external study of apps recommended on the English National Health Service (NHS) Health Apps Library found serious security inadequacies in many apps, including lack of encryption for stored data in 93% of apps (73/79) and transmission of unencrypted identifying information in 29% (23/79) [43].

Table 3 Sample developer guidance: Australian security laws and global best practices [33]

Recommendations

In our tool (Additional file 2), we recommend that developers pay strict attention to the security advice contained within legislative guidelines, and implement strong security protections on all health apps. There are a number of government and industry documents that provide useful detail on security measures for health app developers [22, 53, 54].
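
As one illustration of what “encryption for stored data” can look like in practice, the sketch below assumes an Android app using the AndroidX security-crypto library (androidx.security:security-crypto); the file name and stored values are placeholders, and this is one possible approach rather than a prescription drawn from the guidance documents cited above.

    import android.content.Context
    import androidx.security.crypto.EncryptedSharedPreferences
    import androidx.security.crypto.MasterKey

    // Sketch: keep sensitive values (e.g. symptom scores, session tokens) encrypted
    // at rest instead of writing them to plain-text preferences or files.
    fun storeSecurely(context: Context, key: String, value: String) {
        val masterKey = MasterKey.Builder(context)
            .setKeyScheme(MasterKey.KeyScheme.AES256_GCM)   // hardware-backed where available
            .build()

        val prefs = EncryptedSharedPreferences.create(
            context,
            "secure_health_prefs",                          // placeholder file name
            masterKey,
            EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
            EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
        )
        prefs.edit().putString(key, value).apply()
    }

Transmission of identifying information should similarly be confined to encrypted (TLS/HTTPS) connections; on Android, for example, this can be reinforced by disallowing cleartext traffic in the app's network security configuration.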

More broadly, we encourage greater public debate and discussion to increase public awareness about the inherent insecurities of digital technology. We encourage industry to prioritise innovation in health app security.

Digital content

Apps and app promotional material that contain easily accessible but inappropriate content may cause serious offence to, or be unsuitable for viewing by, consumers (including children) and communities. Apps available in Australia are bound by Federal Government legislation that regulates online content (the Online Content Scheme under Schedules 5 and 7 of the Broadcasting Services Act 1992) [55]. See Table 4 for sample guidance. Many are also subject to the content policies of commercial app stores.

Table 4 Sample developer guidance: Australian digital content laws [33]

Government legislation on digital content may not be effective because easily accessible information about the role and remit of the legislation with regard to apps is limited [55,56,57]. For example, the Australian legislation does not specifically mention apps, referring only to films and games. Furthermore, products are only regulated under this legislation if they are “provided on a commercial basis (i.e. for a fee)” [55], yet many commercially developed apps are monetised without an up-front consumer fee (e.g. via advertising revenue or sale of consumer data). The legislation may also be ineffective because it relies on consumers to identify and file a complaint about offensive material.

In contrast, the content policies of the major app stores, Apple’s App Store and Google Play, are likely to be highly effective: their dominant influence on the app market enables them to act as the de facto regulators of acceptable content standards. This situation is problematic, since Apple and Google Play do not provide detailed information about their standards. Apple, for example, gives the following vague guidance:

We will reject apps for any content or behavior that we believe is over the line. What line, you ask? Well, as a Supreme Court Justice once said, 'I'll know it when I see it.' And we think that you will also know it when you cross it [23].

The values underpinning these kinds of guidelines are opaque to the community and therefore difficult to critique or challenge. This opens the door both to unreasonable or unnecessary censorship and to offensive communication: potentially depriving the public of material likely to be considered acceptable within the general community, while enabling unbridled release and distribution of offensive or inappropriate content (for example, forums for mental health app users might be used as a means to engage in cyber-bullying or to exchange advice on self-harm).

Recommendations

In our tool (Additional file 2), we recommend that app developers be aware of and adhere to legislation around digital content. Due to the near market-monopoly positions of Apple App Store and Google Play Store, legislation will likely play less of a role than the content guidelines from these major app distributors. Consequently, we also urge the public and governments to exert pressure on app stores to be transparent about their standards for content, thus enabling critique of their practices and facilitating change where needed.

Promotion and advertising

Deceptive or misleading promotion of health apps is inconsistent with consumer rights in trade and commerce, may compromise user health and safety, and could contribute to excessive societal health costs by encouraging indiscriminate or inappropriate use of related health services. That is, apps may give false expectations of benefit, or may engender unrealistic expectations about wellbeing and unnecessarily medicalise the ups and downs of daily living [16, 58].

Promotional material includes the app descriptor in a commercial app store or developer website, and any information provided about the app under the auspices of the developer, such as external endorsements reproduced within an app descriptor, and information or reviews hosted on an app developer’s social media channels, even if that information is posted by a third party. Several regulatory policies pertain to advertising materials. First, consumer marketing legislation seeks to protect consumers from deception or misleading claims about products or services that they obtain [59]. Second, legislation about advertising of regulated health services has tighter restrictions on advertising techniques (for example, banning the reproduction of a user review or personal testimonial within an advertisement for health services) [60]. Third, medical device legislation may be applicable to some health apps, and prohibits: advertising that is not for the device’s intended purpose; making reference to a serious disease without prior regulatory approval; and endorsement by health professionals [61]. Finally, the advertising industry has its own voluntary codes that set industry standards for various aspects of advertising, including appropriateness of content [62, 63] and acceptable advertising practices [64]. See Table 5 for sample guidance.

Table 5 Sample developer guidance: Australian advertising laws and global best practices [33]

Although this regulatory oversight appears extensive, its functionality is limited by lack of funds and a heavy reliance on consumer-initiated complaints. The consumer complaints mechanism may be inadequate if consumers are unsure or unaware about the extent of consumer protection laws (e.g. applicability to apps that were free to download and/or developed outside Australia) or where to make complaints, and if government agencies have inadequate resources available to respond to most concerns. In addition, the existing oversight does not provide sufficient protection for vulnerable consumers. It seems likely that many users of mental health apps will be distressed or worried and could be more susceptible to predatory advertising about unrelated topics that are a common source of anxiety (for example, weight, body image, sexual prowess). However, there are no restrictions on the content of advertisements that are targeted at mental health or other vulnerable app users in, for example, banner ads viewed while accessing or using the app, or in personalised ads that pop up during unrelated internet browsing. Further compounding the difficulty in regulating this kind of content is the fact that these types of advertisements are generally not within the developer’s control, but the product of ad libraries that developers choose to embed in their app.
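
Developers who do embed ad libraries are not entirely without levers. As a hedged illustration (assuming the Google Mobile Ads SDK; other ad libraries expose broadly similar settings), a developer can at least request that only general-audience ad content be served inside the app:

    import android.content.Context
    import com.google.android.gms.ads.MobileAds
    import com.google.android.gms.ads.RequestConfiguration

    // Sketch: ask the ad network to serve only general-audience content in this app.
    // This limits ad maturity; it does not, by itself, prevent targeting based on
    // sensitive inferences, which still requires regulatory and industry-level action.
    fun configureAds(context: Context) {
        val config = RequestConfiguration.Builder()
            .setMaxAdContentRating(RequestConfiguration.MAX_AD_CONTENT_RATING_G)
            .build()
        MobileAds.setRequestConfiguration(config)
        MobileAds.initialize(context)
    }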

Recommendations

We recommend that app developers become aware of and adhere to legislative and industry standards on advertising (see Additional file 2). For health apps, this includes not making claims that are unsupported by clinical evidence. In order to increase the levels of consumer protection, we encourage increased consumer awareness of consumer complaints mechanisms, together with increased funding of government consumer protection agencies. Consumer-focused legislation pertaining to trade promotion and advertising has proven to be a powerful regulator of health apps in other jurisdictions and may be the most suitable tool to provide strong oversight in this field [31, 32]. In addition, legislation around promotion and advertising may be particularly suitable as a primary regulator for health apps because apps are consumer-targeted and because such legislation applies to all kinds of apps, not just those that fulfil the specialised definition of a medical device. We also urge the public and governments to place pressure on third parties to self-regulate: creating high standards for the entire chain of actors who participate in behavioural advertising in order to prevent inappropriate practices that target vulnerable consumers.

Consumer finances

Most app stores adhere to industry codes of practice around electronic transactions [65] and will provide a refund if an app does not function as described. We recommend in the tool that app developers who sell their products directly to consumers should follow similar business practices. Other financial matters are less regulated. App subscriptions can continue indefinitely unless actively cancelled by consumers, and there are no government recommendations about limiting the promotion of in-app purchases, except for apps directed at children [12]. See Table 6 for sample guidance.

Table 6 Sample developer guidance: Australian consumer finance laws and global best practices [33]

Recommendations

We recommend that app developers conform to acceptable refund practices and avoid making repeated offers for in-app purchases, particularly within the context of a mental health app (Additional file 2). We have recommended that app developers adhere to consumer advertising legislation, and this includes being transparent in advertising materials about the financial costs associated with app download and use.

In addition, we recommend that consumers and governments exert pressure on app distributors to exclude apps that promote in-app purchases within apps targeted at vulnerable consumers. We also urge distributors to mandate time-limits on subscription payments when apps remain unused for long periods.

Medical device efficacy and safety

Health apps that are intended to function as aids or alternatives to traditional medical services (for example by offering diagnosis, prevention, monitoring or treatment) carry particular risk of harm to users’ health and safety. These kinds of health apps may harm user health, by, for example, aggravating symptoms, delaying diagnosis, or driving unnecessary diagnosis and treatment [13,14,15, 18]. Consequently, these apps fall under medical device legislation, and may be subject to a high level of government scrutiny and control. There are multiple demands on medical device manufacturers, such as ensuring that the device is recorded on a central register, keeping records of quality control practices, collecting clinical evidence about the outcomes of device use and documenting any adverse events. The legislative requirements vary depending on the type of medical device and its associated risk of harm [27].

The application of medical device law to health apps is in a state of flux: the health apps industry is new and there has been debate globally over which health apps, if any, would be classed as medical devices. Recent publications by medical device regulators in several jurisdictions have sought to alleviate this confusion by releasing guidelines specifically about apps for health and/or wellbeing [6, 19, 29]. The Australian guidelines, for example, indicate that health apps will come under medical device legislation if they fulfil the definition of a medical device. The Food and Drug Administration (FDA), the US government agency responsible for medical device regulation, has also advised that it will not pursue regulatory oversight of health apps that fit its definition of low risk of harm, regardless of whether or not they meet the legal definition of a medical device under its law [66]. It is not yet clear whether other jurisdictions will officially adopt a similar policy. See Table 7 for sample developer guidance.

Table 7 Sample developer guidance: Australian medical device laws and global best practices [33]

We have identified several potential problems with policy oversight in the medical device domain. Medical device legislation is complicated and it may be very difficult to understand whether a particular health app comes under the legal definition of a medical device [27, 28]. The matter turns on interpretations of specialised concepts such as what constitutes a disease, and definitions of diagnosis, monitoring or treatment. In addition, for any given medical device, legislative instruments tend to rely on a risk-based classification to determine the particular suite of legislative demands, but developers may be ill-equipped to identify levels of risk or potential harms. Health app developers may be in breach of medical device law if they do not recognise when medical device legislation is relevant, do not think to check – or do not understand – their legal requirements. There are several reasons why the law may not sufficiently protect the health of consumers and communities: even if apps are included on medical device registers, apps change and update frequently and registers may not be designed to deal with this; medical device legislation is at least partly reliant on consumer complaints, particularly for low risk items, but consumers may not know to use this sector for complaint; and medical device law does not cover societal harms that may arise from predatory overdiagnosis [18]. In sum, it may be easy for developers and app stores with benign or malign intentions to produce and distribute health apps that do not conform to medical device law or that risk delivering health harms to consumers and society.

Recommendations

We recommend that all health app developers familiarise themselves with the legal requirements of their local medical device regulator, particularly in relation to any requirements for supporting evidence of efficacy (see Additional file 2). However, we also acknowledge that most health apps will not fit the legal definition of a medical device. Regardless, we encourage all developers of health apps to provide information to users about factors that may influence the apps’ likely efficacy and risk of harms (for example, whether or not the app was developed in conjunction with health experts, whether or not it incorporates recognised healthcare guidelines or is informed by other scientific evidence, and who is funding the app). Particularly, transparency about the scientific evidence (or lack of evidence) underpinning the app’s effectiveness on health outcomes is key.

Medical device legislation seems ill-suited to regulate health apps; the laws are complex, burdensome, and seemingly ineffective at regulating health and safety in a rapidly changing field. It may be more appropriate for a dedicated government body to take oversight, providing a central repository for consumer complaints and registration of adverse outcomes in all domains (e.g. loss of privacy, financial embarrassment, misleading claims, health harms). Such a body might exist as a division of an existing government entity that manages other electronic or digital health matters, or that provides consumer trade protection, and could liaise, as necessary, with medical device regulators and other government departments. This body should be charged with considering the impact of health apps upon both individual consumers and more broadly upon society, and should act to regulate in a manner that protects consumer health and safety, and protects the public interest in avoiding “too much medicine.”

Professional ethics

Governments and industry bodies advise health app developers on additional parameters of app quality that are not covered in legislation, including accountability, truthfulness and transparency [67]. See Table 8 for sample guidance.

Table 8 Sample developer guidance: professional practice [33]

Recommendations

We recommend that app developers should provide details on key stakeholders, scientific sources, and the app’s monetisation strategy. Developers should ensure that their privacy policies, privacy practices and consent processes are not just box-ticking exercises that aim to limit their own legal liability, but actually enable consumers to protect their privacy while using apps (See Additional file 2).

Discussion

We have identified legislation and industry guidelines relevant to mental health apps from multiple, sector-specific domains. We extracted and collated the information into a single tool with the aim of providing guidance to app developers in this field. We anticipate that this tool will be useful for informing mental health app developers about legal requirements, industry standards, and professional expectations. Ultimately, we hope that this will lead to mental health apps that deliver better outcomes for app consumers and the public, and mitigate the risk of breach for app developers. We anticipate that mental health apps developed in accordance with this tool will be more likely to: conform to regulatory requirements, protect consumer privacy, protect consumer finances, and deliver health benefit; and less likely to: attract regulatory penalties, offend consumers and community groups, mislead consumers, and deliver health harms.

The process and result of preparing this developer tool highlighted several important problems with regulation of mental health apps. Legislation is siloed in separate government departments. We note above that many laws depend on self-declared compliance and consumer complaints, and that some laws are not expressed clearly enough to facilitate this. However, the fragmented nature of the regulatory oversight adds a further problem. Consumers who are motivated to complain may not know where or how to direct their concerns. For example, consumers who experience worsening symptoms associated with the use of an app to manage stress may wonder whether complaints and concerns should be directed at consumer protection agencies, medical device agencies or app stores. This means that illegal and potentially harmful apps may persist in the public domain. We suggest that a more cohesive regulatory framework might assist in providing effective oversight of health apps. This might include, for example, having a single point of consumer access for complaints about any aspect of health apps, and a central register for adverse health outcomes from app use, regardless of their assigned level of risk-of-harm. There is ample scope for further research in this area.

Limitations

We prepared this tool with reference to Australian legislation and international influences. Although it will be of most immediate practical use to app developers making products for the Australian market, it can be adapted to suit other jurisdictions by inserting local laws and guidelines into the existing template. There is, however, a particular lack of research on health app content and policy in countries outside of the Western, English-speaking nations, and we would advocate for further work here [36]. The tool is not intended to provide legal advice, but it has been prepared by a team with expertise in relevant sectors, and we think that it provides valuable guidance on the range of laws and industry best practices that pertain to health apps. The tool may have additional limitations associated with the sampling methods. That is, we may have missed relevant sectors or important documents, although our inductive strategy for identifying policies was designed to minimise this. The tool will, inevitably, become out of date as laws and guidelines change, but can be readily updated with revised policies.

Conclusion

There has been much written about the potential for health apps to deliver benefits but, as with many health innovations, less attention has been paid to considering, measuring or minimising the possible harms.

We have created a tool that is suitable for developers who wish to create legally compliant mental health apps that accord with high professional standards of consumer care and protection. We anticipate that it will be helpful with navigating the complex array of legislative guidelines and industry best-practice standards across multiple relevant domains. The tool can be used at all levels of app production, from inspiration through to development and evaluation, and may also be a suitable template for post-market assessment. We encourage government, industry and consumer organisations to use and publicise the tool.

Other key stakeholders in the health apps industry, including consumers, governments, app distributors and third parties, also have a role to play in improving consumer protection and we hope that our suggestions for action might stimulate further discussion.

Abbreviations

ACCAN:

Australian Communications Consumer Action Network

Cth:

Commonwealth

EU:

European Union

FTC:

Federal Trade Commission

Health Apps:

Mobile phone applications targeted at health and wellbeing

IMDRF:

International Medical Device Regulators Forum

IVDMD:

In vitro diagnostic medical devices

NHS:

National Health Service

TGA:

Therapeutic Goods Administration

UK:

United Kingdom

USA:

United States of America

WHO:

World Health Organization

References

  1. research2guidance. mHealth App Developer Economics 2016. Berlin, Germany. 2016. [https://research2guidance.com].

  2. Proudfoot J, Clarke J, Birch MR, Whitton AE, Parker G, Manicavasagar V, Harrison V, Christensen H, Hadzi-Pavlovic D. Impact of a mobile phone and web program on symptom and functional outcomes for people with mild-to-moderate depression, anxiety and stress: a randomised controlled trial. BMC Psychiatry. 2013;13:312.

  3. VA Mobile Health. VA app store, Washington. [https://mobile.va.gov/appstore]. Accessed 3 Apr 2017.

  4. iMedicalApps. iMedicalApps, U.S.A. [iMedicalApps.com]. Accessed 1 Nov 2016.

  5. Agencia de Calidad Sanitaria de Andalucía. Safety and quality strategy in mobile health apps. 2012. [http://www.calidadappsalud.com/en/]. Accessed 21 Apr 2017.

  6. European Commission. Commission Staff Working Document on the existing EU legal framework applicable to lifestyle and wellbeing apps. Brussels, Belgium. 2014. [https://ec.europa.eu/digital-single-market/en/news/commission-staff-working-document-existing-eu-legal-framework-applicable-lifestyle-and].

  7. Australian Government. Australian Government response to Contributing Lives, Thriving Communities - Review of mental health programmes and services. Canberra, ACT: Commonwealth of Australia; 2015.

  8. World Health Organization (WHO). mHealth: new horizons for health through mobile technologies: second global survey on eHealth. Geneva: WHO; 2011. [http://www.who.int/goe/publications/goe_mhealth_web.pdf].

  9. Office of the Information Commissioner Queensland: Guideline: Privacy and mobile apps. Queensland Government, Brisbane, Australia. 2014. [https://www.oic.qld.gov.au/guidelines/for-government/guidelines-privacy-principles/applying-the-privacy-principles/privacy-and-mobile-apps].

  10. Kaye K. FTC: Fitness apps can help you shred calories - and privacy. Advertising Age Detroit, MI; 2014. May 7.

  11. Lohr S. A 10-digit key code to your private life: Your cellphone number. The New York Times New York, NY; 2016. November 12.

  12. Commonwealth Consumer Affairs Advisory Council (CCAAC). App purchases by Australian consumers on mobile and handheld devices: Inquiry report. Canberra, ACT: CCAAC; 2013. [http://ccaac.gov.au/files/2013/07/M-commerce-Final-Issues-Paper_publications.pdf].

  13. Seko Y, Kidd S, Wiljer D, McKenzie K. Youth mental health interventions via mobile phones: a scoping review. Cyberpsychology, behavior and social networking. 2014;17(9):591–602.

  14. Firth J, Torous J. Smartphone Apps for Schizophrenia: A Systematic Review. JMIR Mhealth Uhealth. 2015;3(4):e102.

  15. Shen N, Levitan MJ, Johnson A, Bender JL, Hamilton-Page M, Jadad AA, Wiljer D. Finding a depression app: a review and content analysis of the depression app marketplace. JMIR Mhealth Uhealth. 2015;3(1):e16.

  16. Jutel A, Lupton D. Digitizing diagnosis: a review of mobile applications in the diagnostic process. Diagnosis. 2015;2(2):89.

  17. Mackey TK, Liang BA. It’s time to shine the light on direct-to-consumer advertising. Ann Fam Med. 2015;13(1):82–5.

  18. Carter SM, Rogers W, Heath I, Degeling C, Doust J, Barratt A. The challenge of overdiagnosis begins with its definition. BMJ. 2015;350:h869.

  19. Medicines and Healthcare products Regulatory Agency (MHRA). Medical device stand-alone software including apps (including IVDMDs). London: MHRA; 2016. [https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/564745/Software_flow_chart_Ed_1-02.pdf].

  20. Federal Trade Commission (FTC). Mobile health app developers: FTC best practices. Washington, DC: Federal Trade Commission; 2016. [https://www.ftc.gov/tipsadvice/business-center/guidance/mobile-health-app-developers-ftc-best-practices].

  21. Federal Trade Commission (FTC): Mobile health apps interactive tool. Washington, DC. 2016. [https://www.ftc.gov/tips-advice/business-center/guidance/mobile-healthapps-interactive-tool].

  22. Office of the Australian Information Commissioner (OAIC). Mobile privacy: A better practice guide for mobile app developers. Canberra, ACT: Australian Government; 2014. [https://www.oaic.gov.au/agencies-and-organisations/guides/guide-for-mobile-app-developers].

  23. Apple Inc. App Review, Cupertino, CA. 2015. [https://developer.apple.com/support/app-review/]. Accessed 24 Feb 2017.

  24. Digital Advertising Alliance (DAA): Application of self-regulatory principles to the mobile environment. USA. 2013. [http://www.aboutads.info/DAA_Mobile_Guidance.pdf].

  25. GSMA. Mobile and privacy: privacy design guidelines for mobile application development. GSMA, London. 2012. [http://www.gsma.com/publicpolicy/wpcontent/uploads/2016/09/GSMA2012_Guidelines_PrivacyDesignGuidelinesForMobileApplicationDevelopment_English.pdf].

  26. Fernando JI. Clinical software on personal mobile devices needs regulation. Med J Aust. 2012;196(7):437.

  27. Theisz V. Medical device regulatory practices: an international perspective. Boca Raton, FL: Pan Stanford Publishing; 2016.

  28. U.S. Food and Drug Administration (FDA). Mobile medical applications: Guidance for industry and Food and Drug Administration staff. Silver Spring, MD: FDA; 2015. [http://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM263366.pdf].

  29. Therapeutic Goods Administration (TGA). Regulation of medical software and mobile medical ‘apps’, Canberra, ACT. 2013. [https://www.tga.gov.au/node/4316]. Accessed 22 Oct 2016.

  30. Office of the Australian Information Commissioner (OAIC). Health service providers Sydney, Australia. [https://www.oaic.gov.au/agencies-and-organisations/faqs-foragencies-orgs/health-service-providers/]. Accessed 15 Mar 2017.

  31. Federal Trade Commission (FTC). FTC cracks down on marketers of “melanoma detection” apps, U.S.A. 2015. [https://www.ftc.gov/news-events/pressreleases/2015/02/ftc-cracks-down-marketers-melanoma-detection-apps]. Accessed 15 Mar 2017.

  32. Federal Trade Commission (FTC). FTC charges marketers of 'vision improvement' app with deceptive claims, U.S.A. 2015. [https://www.ftc.gov/news-events/pressreleases/2015/09/ftc-charges-marketers-vision-improvement-app-deceptive-claims]. Accessed 15 Mar 2017.

  33. Parker L, Grundy Q, Bero L. App developer’s guide to law and policy: creating quality mental health apps. Sydney: Australian Communications Consumer Action Network and The University of Sydney; 2017. [http://accan.org.au/files/Grants/PeaceofMind/index.html].

  34. World Health Organization (WHO). Mental health action plan 2013-2020. Geneva: WHO; 2013. [http://apps.who.int/iris/bitstream/10665/89966/1/9789241506021_eng.pdf?ua=1].

  35. Evenstad L. Theresa May announces £67m fund for digital mental health services. Computer Weekly London, UK; 2017. January 10.

  36. Grundy QH, Wang Z, Bero LA. Challenges in Assessing Mobile Health App Quality: A Systematic Review of Prevalent and Innovative Methods. Am J Prev Med. 2016;51(6):1051–9.

  37. Stoyanov SR, Hides L, Kavanagh DJ, Zelenko O, Tjondronegoro D, Mani M. Mobile app rating scale: a new tool for assessing the quality of health mobile apps. JMIR mHealth uHealth. 2015;3(1):e27.

  38. Information and Privacy Commission New South Wales (IPC). Privacy Laws, Sydney. [http://www.ipc.nsw.gov.au/privacy-laws]. Accessed 13 Apr 2017.

  39. Daly A. The law and ethics of ‘self-quantified’ health information: an Australian perspective. International Data Privacy Law. 2015;5(2):144–55.

  40. Not-for-profit law. Privacy Guide: a guide to compliance with Victorian and Federal privacy laws. Melbourne: Justice Connect; 2014. [https://www.nfplaw.org.au/sites/default/files/media/Privacy_Guide_0_0_0.pdf].

  41. Office of the Australian Information Commissioner (OAIC). Mobile apps must put user privacy first, Sydney. 2014. [https://www.oaic.gov.au/media-and-speeches/media-releases/mob-apps-must-put-privacy-first]. Accessed 9 Sept 2016.

  42. Sunyaev A, Dehling T, Taylor PL, Mandl KD. Availability and quality of mobile health app privacy policies. J Am Med Inform Assoc. 2014.

  43. Huckvale K, Prieto J, Tilney M, Benghozi P-J, Car J. Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment. BMC Medicine. 2015;13(1):214.

  44. Ackerman L. Mobile Health and Fitness Applications and Information Privacy; Report to California Consumer Protection Foundation. Privacy Rights Clearing House, San Diego, CA. 2013. [http://www.privacyrights.org/sites/default/files/mobile-medical-apps-privacy-consumer-report.pdf].

  45. Pasquale F. The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press; 2015.

  46. Martin KE. Ethical Issues in the Big Data Industry. MIS Quarterly Executive. 2015;14(2):67–85.

  47. Lookout. Mobile app advertising guidelines. San Francisco, CA. 2012. [https://www.mylookout.com/img/images/lookout-mobile-app-advertising-guidelines.pdf]. Accessed 2 Nov 2016.

  48. The National Health and Medical Research Council (NHMRC), Australian Research Council (ARC), Australian Vice-Chancellors’ Committee (AVCC). National statement on ethical conduct in human research. Canberra: Commonwealth of Australia; 2007. (updated May 2015).

  49. Lookout. Lookout App Security, San Francisco, CA. [https://www.lookout.com/products/app-security]. Accessed 2 Nov 2016.

  50. Flaherty JL. Digital Diagnosis: Privacy and the Regulation of Mobile Phone Health Applications. Am J Law Med. 2014;40(4):416–41.

  51. Yang YT, Silverman RD. Mobile health applications: the patchwork of legal and liability issues suggests strategies to improve oversight. Health Aff (Millwood). 2014;33(2):222–7.

  52. Steinhubl SR, Muse ED, Topol EJ. The emerging field of mobile health. Sci Transl Med. 2015;7(283):283rv3.

  53. Article 29 Data Protection Working Party: Opinion 02/2013 on apps on smart devices. Brussels: European Commission. 2013. [http://ec.europa.eu/justice/dataprotection/article-29/documentation/opinion-recommendation/files/2013/wp202_en.pdf].

  54. Office of the Australian Information Commissioner (OAIC). Guide to securing personal information. Australian Government, Canberra, ACT. 2015. [https://www.oaic.gov.au/agencies-and-organisations/guides/guide-to-securing-personal-information].

  55. Department of Communications and the Arts. Online content regulation. Canberra, ACT. 2013. [https://www.communications.gov.au/policy/policy-listing/online-content-regulation]. Accessed 11 Oct 2016.

  56. Department of Communications and the Arts. Welcome to Australian Classification. Australia. 2015. [http://www.classification.gov.au/Pages/Home.aspx]. Accessed 7 Nov 2016.

  57. Australian Cybercrime Online Reporting Network (ACORN). Prohibited offensive and illegal content. Australia. [https://www.acorn.gov.au/learn-aboutcybercrime/prohibited-offensive-and-illegal-content] Accessed 7 Nov 2016.

  58. Welch HG, Schwartz LM, Woloshin S. Overdiagnosed: Making people sick in the pursuit of health. Massachusetts: Beacon Press; 2011.

  59. Australian Competition and Consumer Commission (ACCC). Advertising and selling guide. Canberra, ACT: ACCC; 2014. [https://www.accc.gov.au/accc-book/printerfriendly/29527].

  60. Australian Health Practitioner Regulation Agency (AHPRA). Guidelines for advertising regulated health services. Melbourne: AHPRA; 2014. [http://www.medicalboard.gov.au/Codes-Guidelines-Policies/Guidelines-for-advertising-regulated-health-services.aspx].

  61. Therapeutic Goods Administration (TGA). Advertising health services with medical devices. Canberra, ACT. 2015. [https://www.tga.gov.au/advertising-health-servicesmedical-devices]. Accessed 13 Sept 2016.

  62. Australian Association of National Advertisers (AANA). Code of Ethics - practice note. Sydney, NSW. 2016. [http://aana.com.au/self-regulation/codes/].

  63. Australian Association of National Advertisers (AANA). Code of Ethics. Sydney, NSW. 2016. [http://aana.com.au/self-regulation/codes/].

  64. Australian Association of National Advertisers (AANA). Best practice guideline - responsible marketing communications in the digital space. Sydney, NSW. 2013. [http://aana.com.au/content/uploads/2014/05/AANA-Best-Practice-Guideline-Responsible-Marketing-Communications-in-the-Digital-Space.pdf].

  65. Australian Securities & Investments Commission (ASIC). ePayments code, Australia. 2016. [http://asic.gov.au/regulatory-resources/financial-services/epayments-code/]. Accessed 11 Oct 2016.

  66. U.S. Food and Drug Administration (FDA). General wellness: Policy for low risk devices. Rockville: U.S. Food and Drug Administration; 2016. [http://www.fda.gov/downloads/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm429674.pdf].

  67. Weckert J, Lucas R. Professionalism in the information and communication technology industry. Canberra: ANU Press; 2013.

  68. Federal Trade Commission (FTC). Marketing your mobile app: Get it right from the start. Washington, DC. 2013. [https://www.ftc.gov/tips-advice/businesscenter/guidance/marketing-your-mobile-app-get-it-right-start].

  69. Australian Communications and Media Authority (ACMA). Mobile apps: Emerging issues in media and communications. Canberra, ACT. 2013. [http://www.acma.gov.au/~/media/Regulatory%20Frameworks%20and%20International%20Engagement/Information/pdf/Mobile%20apps%20Emerging%20issues%20in%20media%20and%20communications%].

  70. International Medical Device Regulators Forum. Software as a Medical Device (SaMD): Key definitions (2013); Possible framework for risk categorization and corresponding considerations (2014); application of quality management systems (2015). [http://www.imdrf.org/workitems/wi-samd.asp]. Accessed 1 Mar 2017.

  71. Google Play. Developer Policy Center, Mountain View, CA. 2016. [https://play.google.com/about/developer-content-policy/]. Accessed 27 Oct 2016.

  72. Attorney-General’s Department. Stay smart online: Mobile devices. Canberra, ACT. 2016. [https://www.staysmartonline.gov.au]. Accessed 24 Feb 2017.

  73. Australian Communications and Media Authority (ACMA). A guide to apps & in-app purchases, Canberra, ACT. 2016. [http://www.acma.gov.au/Citizen/Phones/Mobile/Content-and-services/apps-and-in-apps-purchases-a-guide-for-consumers]. Accessed 24 Feb 2017.

  74. Health Navigator NZ. How to choose a health app. Auckland, New Zealand. 2016. [http://www.healthnavigator.org.nz/app-library/h/health-apps-how-to-choose/]. Accessed 11 Oct 2016.

  75. Health Navigator NZ. App library. Auckland, New Zealand. 2016. [http://www.healthnavigator.org.nz/app-library/]. Accessed 11 Oct 2016.

  76. Navy and Marine Corps Public Health Center (NMCPHC). Choose wisely: selecting mobile health apps. NMCPHC, Portsmouth, VA. 2014. [http://www.med.navy.mil/sites/nmcphc/Documents/program-and-policy-support/HPW-000051-Choose-Wisely-Selecting-Mobile-Health-Apps_v2.pdf].

  77. VicHealth. VicHealth’s top 10 tips for choosing a healthy living app. Carlton: VicHealth. 2015. [https://www.vichealth.vic.gov.au]. Accessed 31 Oct 2016.

  78. VicHealth. Caution needed when downloading health apps. Carlton: VicHealth. 2015. [https://www.vichealth.vic.gov.au]. Accessed 31 Oct 2016.

  79. VicHealth. Healthy Living Apps Guide. Carlton: VicHealth. 2016. [https://www.vichealth.vic.gov.au]. Accessed 31 Oct 2016.

  80. VicHealth. VicHealth ratings guide shows which health and wellbeing apps work best. Carlton: VicHealth. 2016. [https://www.vichealth.vic.gov.au]. Accessed 31 Oct 2016.

  81. VicHealth: Selecting, reviewing and rating healthy living apps - our process. Carlton: VicHealth. 2016. [https://www.vichealth.vic.gov.au]. Accessed 31 Oct 2016.

  82. VicHealth: Is there an (effective) app for that? Carlton: VicHealth; 2016. [https://www.vichealth.vic.gov.au]. Accessed 31 Oct 2016.

  83. dialogue consulting. Guidelines for creating healthy living apps. Melbourne: Deakin University; 2015. [https://www.vichealth.vic.gov.au/~/media/Images/VicHealth/Images%20and%20Files/MediaResources/HPApps/Guidelines-Creating-Healthy-Living-Apps.pdf].

  84. TRUSTe. TRUSTed Apps Privacy Certification. San Francisco, CA. 2016. [https://www.truste.com/business-products/trusted-apps/]. Accessed 13 Sept 2016.

  85. myhealthapps.net. my health apps: tried and tested by people like you, London. [http://myhealthapps.net/]. Accessed 11 Oct 2016.

  86. myhealthapps.net. The myhealthapps directory 2015-2016. London. 2015. [http://www.patient-view.com/uploads/6/5/7/9/6579846/the_myhealthapps_directory_2015-2016.pdf].

  87. my health apps. Health apps: Towards a balanced life. London: my health apps; 2014.

  88. Misra S. Doctors’ guide to choosing health apps that really work. everydayhealth.com, New York; 2015. July 29.

  89. PsyberGuide. PsyberGuide. Rutherford, CA. [http://psyberguide.org/]. Accessed 11 Oct 2016.

Acknowledgements

We wish to thank Andrew Ingersoll and Vanessa Halter from the Australian Digital Health Agency for advice and assistance with data collection and analysis and feedback on the manuscript.

Funding

This work was funded by a research grant from the Australian Communications Consumer Action Network (ACCAN). The operation of the Australian Communications Consumer Action Network is made possible by funding provided by the Commonwealth of Australia under section 593 of the Telecommunications Act 1997. This funding is received from charges on telecommunications carriers. The project was encouraged and supported by ACCAN, who worked with the researchers throughout the design and dissemination of the study to facilitate research translation. However, the researchers had full autonomy over study design, data collection, analysis and publication.

Availability of data and materials

All data generated or analysed during this study are available upon request from the authors.

Author information

Contributions

LP participated in the study design, conducted data collection and analysis and drafted the manuscript. QG conceived of the study, participated in its design, data collection and analysis and conducted major revisions. TK, DG, BM and MR participated in study design and analysis and revised the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Lisa Parker.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Inclusion and exclusion criteria for our policy sample. (DOCX 58 kb)

Additional file 2:

App developer’s guide to law and policy. A guidance document to assist app developers create health apps that are legally compliant and in line with industry and community standards. (PDF 711 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Parker, L., Karliychuk, T., Gillies, D. et al. A health app developer’s guide to law and policy: a multi-sector policy analysis. BMC Med Inform Decis Mak 17, 141 (2017). https://doi.org/10.1186/s12911-017-0535-0
