
Ethical perspectives on recommending digital technology for patients with mental illness


The digital revolution in medicine not only offers exciting new directions for the treatment of mental illness, but also presents challenges to patient privacy and security. Changes in medicine are part of the complex digital economy based on creating value from analysis of behavioral data acquired by the tracking of daily digital activities. Without an understanding of the digital economy, recommending the use of technology to patients with mental illness can inadvertently lead to harm. Behavioral data are sold in the secondary data market, combined with other data from many sources, and used in algorithms that automatically classify people. These classifications are used in commerce and government, may be discriminatory, and result in non-medical harm to patients with mental illness. There is also potential for medical harm related to poor quality online information, self-diagnosis and self-treatment, passive monitoring, and the use of unvalidated smartphone apps. The goal of this paper is to increase awareness and foster discussion of the new ethical issues. To maximize the potential of technology to help patients with mental illness, physicians need education about the digital economy, and patients need help understanding the appropriate use and limitations of online websites and smartphone apps.


Today there are many sources of big data in medicine beyond those created directly by physicians in electronic medical records (EMR). Data may be linked from imaging, pharmacy records, laboratory data, ‘omics data (large-scale genomic, metabolomic, and proteomic datasets), and administrative claims from government and private insurers (McKinsey 2011; Monteith et al. 2016a). In the future, IBM predicts that the majority of medical data will be created by patients and non-providers from health apps, patient monitoring, and from behavioral data based on the tracking of daily digital transactions (Slabodkin 2015). Important features of big data are massive size, heterogeneity, uneven quality, and the need for sophisticated automated techniques to find meaning. Already, clinical data from many provider systems are being shared in large regional or national databases to improve consistency of care, and to facilitate a wide range of medical research that increasingly involves commercial organizations (Powles 2016; IBM 2016). Both the adoption of digital apps and monitoring devices, and the use of analytics on big data from diverse sources, are considered key aspects for improving healthcare and increasing cost-efficiencies (WEF 2016).

But the growth of big data and data sharing may also result in serious non-medical and medical issues for patients. The same big data technologies and analytical techniques used in medicine are also used for commercial purposes. The behavioral data acquired from the continual tracking of digital activities are sold in the secondary data market and used in algorithms that automatically classify people (Executive Office 2016; FTC 2016a). These classifications may affect many aspects of life including credit, employment, law enforcement, higher education, and pricing. Due to errors and biases embedded in data and algorithms, the non-medical impact of the classifications may be damaging to those with mental illness who already face stigmatization in society (Monteith and Glenn 2016). There are also potential medical risks to patients associated with poor quality online information, self-diagnosis and self-treatment, passive monitoring, and the use of unvalidated smartphone apps.

The goal of this paper is to increase understanding and promote discussion of the ethical issues of the digital economy that affect the treatment of patients with mental illness. Without an understanding of the digital economy, physician recommendations to patients to use technology may inadvertently lead to harm. Before discussing the ethical issues, a brief background on the digital economy, data privacy, and societal pressure to disclose information is provided.

Digital economy

The impact of big data on healthcare is part of the ongoing digitization of all major industries. Personal data are viewed as the fundamental and transformative asset class of the new digital economy and the basis for analytic decision-making (WEF 2012, 2016). Massive amounts of personal data are created and tracked from all aspects of life that involve technology, including routine daily activities such as using the Internet, social media, cell phones, smartphones, email, credit and debit cards, customer loyalty cards, posting pictures online and making mobile payments. Increasingly, large amounts of machine-generated data are produced by sensors, video cameras, license plate readers, GPS systems, E-ZPass, RFID (radio frequency identification) devices, and fitness trackers (IDC 2014). Metadata (data about data) is collected to provide context. Modern tracking techniques include sophisticated browser fingerprinting, and cookie syncing (user ID sharing) between trackers (Englehardt and Narayanan 2016). In the past, it was only profitable to collect personal data about the rich and famous (Goldfarb and Tucker 2012). The costs of data capture, storage, and distribution are now so low that it is profitable to collect personal data about everyone.
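The principle behind browser fingerprinting can be illustrated with a short sketch (a hypothetical simplification, not any actual tracker's code): many individually harmless attributes, combined and hashed, yield an identifier that is stable across visits even after cookies are cleared.

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Combine browser/device attributes into a stable identifier.

    Real trackers use far more signals (canvas rendering, installed
    fonts, audio stack); this sketch only shows the principle: many
    individually harmless attributes combine into a near-unique ID.
    """
    # Sort keys so the same attributes always produce the same hash
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Illustrative attribute values only
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080x24",
    "timezone": "UTC-5",
    "language": "en-US",
    "plugins": "pdf,webgl",
}
print(browser_fingerprint(visitor))  # same attributes -> same ID on every visit
```

Because no cookie is stored, clearing browser data does not remove the identifier; only changing the underlying attributes does.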

Personal data are collected by data trackers, combined with other data, analyzed, and re-sold as data products by data brokers (Martin 2015; GAO 2013; WEF 2012). A standard business model for online companies that provide free services, such as search engines and medical sites, is to track activities for behavioral advertising and sell this personal data to third parties (Goldfarb and Tucker 2011; Stark and Fins 2013; Rosenberg 2016). Commercial, governmental, and academic organizations that purchase data products often recombine and re-analyze the data. Digital copies of data products can be sold endlessly. Personal data are valuable because they provide information about a person’s behavior based on the details of daily activities, thoughts, and personal connections (Pentland 2012). The value of personal data increases as the number of connections with other datasets increases.

Much of the personal data is sensitive information that is voluntarily shared by individuals, their friends, and their family (Fairfield and Engel 2015). Although metadata does not contain content, it often provides information just as sensitive as content, such as documenting regular calls to a psychiatrist’s office. Data from sources that appear harmless and unrelated may be combined to detect highly sensitive information, such as predicting sexual orientation from Facebook Likes (Kosinski et al. 2013). Firms are combining data from credit card purchases, lifestyle factors, Internet searches, and social media to recruit for clinical trials without accessing medical records (Walker 2013). Many individuals are not aware of activity tracking, and the buying and selling of their personal data (FTC 2014). Online personal data contain many errors, yet digital copies exist at different locations, making it nearly impossible to correct or permanently delete the data (PCAST 2014).
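The Facebook Likes finding can be shown schematically. The sketch below uses synthetic data and made-up binary features (it is not the Kosinski et al. model) to train a minimal logistic-regression classifier that predicts a sensitive attribute from seemingly unrelated "likes":

```python
import math

def train_logistic(X, y, lr=0.5, epochs=200):
    """Tiny logistic regression by per-sample gradient descent (no libraries)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - yi                      # gradient of log loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1.0 / (1.0 + math.exp(-z))

# Each row: binary "likes" for three innocuous pages (fabricated data);
# y is a sensitive attribute the classifier learns to infer.
X = [[1, 0, 1], [1, 1, 1], [0, 0, 1], [0, 1, 0], [1, 0, 0], [0, 0, 0]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
print(predict(w, b, [1, 0, 1]))  # high probability for a class-1 pattern
```

The point is not the tiny model but the pipeline: given enough labeled examples, any holder of behavioral data can fit such a classifier and apply it to people who never disclosed the sensitive attribute.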

The collected data based on tracking behaviors enable automated decision-making, such as consumer profiling, risk calculation, and measurement of emotion. These algorithms broadly impact our lives in education, insurance, employment, government services, criminal justice, information filtering, real-time online marketing, pricing, and credit offers (Yulinsky 2012; Executive Office 2016; Pasquale 2015). Decision-making independent of human involvement may perpetuate long-standing inequalities and exclusions, due to errors and human biases embedded in data and algorithms (Executive Office 2016; FTC 2016a; PCAST 2014). A report from the US Executive Office warns that “big data could enable new forms of discrimination and predatory practices” (Executive Office 2014), which is of particular concern to those with mental illness (Table 1). The details of most commercial and governmental algorithms are hidden from public view, leaving the public little recourse to challenge decisions (Pasquale 2011; Kerr and Earle 2013).

Table 1 Examples of automatic classification of people based on big data in the US

Algorithms based on big data are also used by criminals to target potential victims, and some people with mental illness may be especially susceptible (Monteith and Glenn 2016). Factors that increase vulnerability to online fraud include intermittent Internet use, less familiarity with technology (Sheng et al. 2010; Downs et al. 2007), high impulsivity, low attention to online cues (Mayhorn et al. 2015), and cognitive impairment (Claycomb et al. 2013).

Data privacy

One consequence of the digital economy is a loss of personal privacy (Wigan and Clarke 2013). According to Eric Schmidt, Executive Chairman of Alphabet (Google’s parent company), “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.” (Saint 2010). All thoughts, ideas, pictures, emotions, priorities, and prejudices that are publicly disclosed on social media are sold (Claypoole 2014). Many experts, including the US FBI director and the CEO of Facebook, tape over the cameras on their laptops and smartphones to prevent video surveillance (Hern 2016). The technologies of cloud computing and mobile devices create more challenges to privacy (Benkler 2016), and big data enables very large-scale breaches (Matwin 2013). Soon, almost everything will contain an embedded chip (Internet of Things), with Cisco estimating that 37 billion intelligent things will be connected and communicating by 2020 (Evans 2013). These connected things can be controlled remotely without human intervention, collect data, make decisions, create new privacy threats (Schneier 2016; Sarma 2015), and further erode privacy in the home (Friedland 2015).

The two primary approaches to online privacy protection are notice and choice (online individual consent at websites or apps) and anonymization, but neither approach is effective (PCAST 2014). Individuals rarely read online consent forms (PCAST 2014). The average person would need 201 h to read the privacy policies for the websites they visit in a year (McDonald and Cranor 2008). De-identification (anonymization) techniques are increasingly defeated with high-dimensional big data (PCAST 2014). Online privacy tools are confusing and ineffective for most people (CMU 2011). Even though privacy in the era of big data is very complex, changes to the legal framework are coming, including the general data protection regulation (GDPR) to be implemented by 2018 for the EU (EU News 2016), and the EU-US Privacy Shield (EU-US 2016).

Societal pressure to disclose information

At the same time that technology is making it easy to collect massive amounts of personal data, commercial organizations are promoting self-revelation to make greater profits, and governments are promoting sharing to improve healthcare for the greater good.

Businesses that profit from collecting, analyzing, and selling personal data study online behavior and incorporate measures to encourage disclosure (Acquisti et al. 2015; Claypoole 2014; Google 2016a). People divulge information online because they are susceptible to manipulations that promote disclosure (Acquisti et al. 2015), and because it is intrinsically rewarding (Tamir and Mitchell 2012). Websites are designed with trust-building techniques that generate a sense of community and facilitate sharing, such as providing the perception of control (Siau and Shen 2003; Luo and Najdawi 2004; Brandimarte et al. 2013). Default privacy settings have a huge impact since these are rarely changed (Gross and Acquisti 2005; Acquisti et al. 2015). Reciprocity, in which the questioner offers information first, increases responses to personal questions even when the questioner is a computer (Barak and Gluck-Ofri 2007; Fogg and Nass 1997; Harris 2016). Social media websites use reciprocity to expand contact lists and also foster activities that provide social approval such as tagging photos (Harris 2016). Other measures to promote disclosure include site registration, sweepstakes that require registration (Neus 2000), and pop-up forms to collect data before allowing task completion (Conti and Sobiesk 2010). Additionally, many respected and well-publicized leaders of technology companies are champions of changing societal norms about privacy (Johnson 2010; Noyes 2015; Gralla 2010).

Public health organizations in the US and UK promote digital tools as a means to engage patients as active participants and empower patients with information (Mostashari 2013; Gov.UK 2016). In the US, the Department of Health and Human Services (HHS) envisions the use of mobile devices for continuous patient monitoring (Mostashari 2013), including for behavioral health (Wong 2013). Mobile apps are seen as a means to promote healthy lifestyles and behavioral changes (Webb et al. 2010; Dennison et al. 2013), and digital engagement is viewed as a positive attribute that will decrease healthcare costs for society (Lupton 2013). Major healthcare initiatives involve the creation of large national cohorts like the UK Biobank (0.5 million people) and the US Precision Medicine Initiative (goal of 1 million people) (Biobank 2016; White House 2015). These projects strongly emphasize data sharing, and many have plans to include mobile devices for patient monitoring and promotion of healthy behaviors (PMI 2015).

With the emphasis on data sharing by government and industry, privacy is often portrayed as an impediment to progress, such as to achieving data-driven advances in healthcare (Cairns 2015; Goldfarb and Tucker 2012; Sarpatwari and Gagne 2016). Privacy regulation is often described as stifling technological innovation (Ruoff 2016). Yet, despite the pressure to disclose, people still want privacy. In the US, privacy remains important to people of all ages, including young adults aged 18–24 years (Hoofnagle et al. 2010). Ninety-two percent of Americans want the right to delete all online information (Turow et al. 2009). There is a special unease relating to disclosure of medical data. Many in the US and UK remain concerned about the privacy of data in the EMR and would like to limit sharing (Kim et al. 2015; Schwartz et al. 2015; eTRIKS 2016), especially of sensitive data (Caine and Hanania 2013; Flynn et al. 2003; Snell 2017). Between 27 and 54% of patients may withhold information from a physician due to technology-related privacy concerns (Fair Warning US 2011; Fair Warning UK 2012; California HealthCare 2010). Most teenage patients with chronic illness do not disclose their health information on social media (van der Velden and El Emam 2013). In a recent international study of patients with bipolar disorder, the reason many looked online for information was because they incorrectly thought they would be anonymous (Conell et al. 2016).

Ethical issues

Given the opaque nature of the digital economy and the disruptions associated with rapidly evolving technological change, new ethical issues are arising in psychiatry from the use of technology. The classification of individuals based on big data may have long-lasting and negative non-medical impacts (Executive Office 2016; FTC 2016a). The use of unvalidated apps, medical websites with poor quality information, or self-diagnosis and self-treatment may lead to medical risks, including a delay in seeking professional help (Ryan and Wilson 2008; Armontrout et al. 2016). Traditional societal concepts of which data are public versus private, and medical versus non-medical, are blurring (Tene and Polonetsky 2013; Monteith and Glenn 2016; Friedland 2015). Without addressing the new ethical issues, physicians may inadvertently harm patients with mental illness by recommending the use of technology. To discuss these ethical issues, several questions will be posed.

Issue 1:

Should physicians recommend digital technology when patients lack technical skills and understanding of the digital economy?

Patients vary greatly in access to digital technology, technical skills, ability to safely use the Internet, and understanding of the digital economy. Disparities in Internet access, referred to as the “digital divide,” may be due to socioeconomic factors including income (Hilbert 2014), education (Cruz-Jesus et al. 2016), age (Friemel 2016), and the telecommunications infrastructure (ITU 2014). Although access has dramatically increased internationally over the last decade, Internet and smartphone use remains much lower for those with mental and physical disabilities and the elderly than for the general public (Choi and DiNitto 2013; Klee et al. 2016; Miller et al. 2016; Friemel 2016). Internet access for the poor may be intermittent and unreliable (Gonzales 2016). The digital divide is now evolving to reflect differences in technical skills, online literacy, and usage patterns, with less educated people spending more time on entertainment and less time information seeking (Büchi et al. 2015; van Deursen and Van Dijk 2014).

It is often mistakenly assumed that younger people are universally competent with technology. However, there are considerable differences in online skill levels among those who grew up surrounded by digital technologies (Hargittai 2010; ICILS 2014; Selwyn 2009). Modern digital technologies such as smartphones and video games are so widespread because they can be easily used by people without a technical background. Concepts of digital competency have evolved from an understanding of how technology works to being capable of using digital devices to achieve goals and solve tasks. People are not good at self-rating technical skills (Conell et al. 2016; Ivanitskaya et al. 2006), and even a technically skilled person who uses devices properly may not understand the increasingly interconnected digital economy.

There is no obvious way for the physician to know if a patient has sufficient knowledge of the digital economy to use technology wisely. The risk of digital data generated from the use of a smartphone app or Internet activities being used against a patient’s interest outside of medicine is real. Furthermore, there will always be significant inequalities in access and skills since technology keeps evolving, with industry creating new products and services (Hilbert 2016; Arthur 2010). Constant technological progress will always be accompanied by disparities in the diffusion and adoption of new innovations.

Issue 2:

Can physicians ignore patient use of digital technology?

One major benefit of the Internet is the abundance of medical information, and about 3/4 of Internet users in Europe and the US seek medical information (Andreassen et al. 2007; Pew Research 2013; Bauer et al. 2016). The quality of information about mental illness on the websites ranked highly by general search engines is generally good but does vary (Grohol et al. 2014; Reavley and Jorm 2011; Monteith et al. 2013). Searching for medical information is not easy. Consumers often judge medical websites by the visual appearance (Fogg et al. 2003; Robins et al. 2010), and may accept the first answer they receive (de Freitas et al. 2013; Conell et al. 2016). In a recent study, it was hard to get answers to general mental health questions from the well-organized NIMH website (Crangle and Kart 2015). Websites usually contain introductory information about a disease, but patients often have multiple medical and psychiatric diagnoses, long-standing illness, take numerous medications, and are looking for answers about their personal situation (Conell et al. 2016; Miller 2007). Most patients do not discuss information found online with their physicians (Conell et al. 2016; Chung 2013).

The frequency of online self-diagnosis is increasing rapidly, and may be particularly attractive to those suspecting mental illness because of the stigma, a desire for privacy, and a need to save money. One-third of adults in the US use Internet resources to self-diagnose (Kuehn 2013), and there are 50 million uses yearly of the iTriage app for symptom checking and provider selection (Aetna 2013). Many websites contain symptom checkers for mental disorders. For example, the UK NHS offers online self-assessments for sleep, mood, depression, and money worries (NHS Tools 2016), and the US VA for alcohol abuse, depression, PTSD, and substance abuse (VA 2016). Symptom checkers are also found on smartphone apps (Shen et al. 2015; Lupton and Jutel 2015) and direct-to-consumer (DTC) pharmaceutical advertising websites where legal (Ebeling 2011). Patients may also receive targeted online advertising for DTC genetic and other laboratory testing (NLM 2016; AACC 2015). Diagnosis is routinely discussed in some online mental health communities (Giles and Newbold 2011). However, a study of 23 symptom checkers (online and apps) found that the diagnostic and triage advice across a wide range of medical diagnoses was often inaccurate (Semigran et al. 2015).

Some patients who self-diagnose may then proceed to self-treat. Virtually every prescription drug can be purchased from an online pharmacy (Orizio et al. 2011). Drugs prescribed for psychiatric disorders are a leading class of drugs sold at rogue online pharmacies that do not require a prescription (Leontiadis et al. 2013). Websites for many rogue pharmacies are professionally designed, contain false quality seals, and cannot be differentiated from legitimate pharmacies solely by appearance (Monteith et al. 2016b). About 1/3 of patients with mental illness take supplement products, which are often self-selected, purchased online, associated with false advertising claims and quality problems, and may interact with prescribed medications or other supplements (OIG 2012; Bauer et al. 2015; Wu et al. 2007; O’Connor 2015). In 2015, there were over 47,000 mental health apps on sale to US consumers offering many functions (IMS 2015). Most of these apps were not validated, and only a few were tested, primarily in small, short-term pilot studies (Donker et al. 2013; Anthes 2016).

Some health websites use fraudulent tactics or promote illegal or dangerous activities. For example, Lumosity was fined for unfounded claims of cognitive enhancement from online games and apps (FTC 2016b). Some online self-tests for Alzheimer’s disease are not valid or reliable, and do not follow ethical norms for medical interventions (Robillard et al. 2015). Drugs of abuse are readily available online such as opioids (Bert et al. 2015), stimulants (Ghodse 2007), and hallucinogens (Barratt et al. 2014). Other websites intentionally promote dangerous behavior including suicide (Luxton et al. 2012) and anorexia (Borzekowski et al. 2010). Some patients even build do-it-yourself (DIY) medical devices from instructions available online, including dangerous DIY transcranial direct current stimulation devices (Greene 2016; Wurzman et al. 2016).

Physicians should assume that all patients will use digital technology at some point in the diagnosis and course of a chronic psychiatric illness. It is notable that many of the same instruments used by physicians to screen and monitor mental illness are now available online at no cost to patients, including the scoring cutoffs. For example, many instruments are available for depression screening including the PHQ-9, Beck Depression Inventory, Duke Anxiety-Depression Scale (DADS), and the Edinburgh Postnatal Depression Scale (USPTF 2015; VA 2016; UCSF 2013; Duke University 2016; Kerr and Kerr 2001). The public now has access to physician screening tools without the knowledge and experience to interpret the results.

While many patients are not thinking about privacy while searching online (Conell et al. 2016; Libert 2015), a study of over 80,000 health-related websites found that over 90% of the websites sent information to third parties, with 70% of these including specifics on symptoms, treatments, and diseases (Libert 2015). Patients need basic information to use digital technologies with the least risk of harm, including guidance to help clarify the limits of self-diagnosis and self-treatment. A list of a small number of recommended websites should be provided to patients (Monteith et al. 2013; Conell et al. 2016).

Issue 3:

Do physicians understand mental state monitoring by commercial organizations?

With the coming of the Internet of Things, the next evolutionary step in computing is widely seen as computers reading human emotions (Pantic et al. 2007; Zeng et al. 2009; Cambria 2016). With this vision, instead of computers and devices, there will be human-centered artificial intelligence-based cognitive assistants that understand natural language, read emotions from facial expressions, voice and text, and become essential helpers throughout the day (Pantic et al. 2007; Ebling 2016; Google 2016b; Lardinois 2016). The recognition of emotion will be based on multimodal, context-dependent systems, including facial expression and voice data (Pantic et al. 2007; Zeng et al. 2009) (Table 2). With a human–computer interface based on automated reading of emotion, users will require fewer technical skills. Personalized assistants are envisioned in medicine for both physicians and patients (Sutton 2016; Ebling 2016). There is a huge investment by the technology industry in emotion recognition. Apple, Facebook, Google, Microsoft, IBM, and Samsung were all recently awarded or applied for US patents related to inferring mood and emotion using online and smartphone data (Glenn and Monteith 2014; Brachman 2014; Kleinman 2016; Barron 2016). Today, commercial organizations and governments are routinely using algorithms based on the big data collected from daily digital transactions to predict behavior and mental state, and to categorize and profile people (Pasquale 2015).

Table 2 Examples of technologies involved in automated emotion recognition

Academic researchers from various areas, including computer science, linguistics, and psychology, are using publicly available datasets from social media to predict mental state, including depression (Resnik et al. 2015), suicide risk (De Choudhury et al. 2016), psychopathy (Wald et al. 2012), psychological disorders (Dinakar et al. 2015), and severity of mental illness (Chancellor et al. 2016). Medical research is investigating passive data collection in humans for monitoring mental illness, with pilot studies completed for bipolar disorder (Faurholt-Jepsen et al. 2016; Gruenerbl et al. 2014; Karam et al. 2014), schizophrenia (Ben-Zeev et al. 2016; Wang et al. 2016), and depression (Saeb et al. 2015). Both the academic and medical research often use the same data elements as commercial behavioral profiling, creating parameters based on smartphone calls, app usage, text messages, smartphone sensor data on location, mobility, voice analysis, and the content of social media and text messages.
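To make the overlap concrete, the sketch below computes a simple "location variance" feature from GPS samples, similar in spirit to (though not identical to) the mobility features explored in passive depression-monitoring studies such as Saeb et al. (2015). The coordinates are fabricated; the point is that any app with location access, medical or commercial, can compute such a feature.

```python
import math
from statistics import pvariance

def location_variance(points):
    """Log of combined latitude/longitude variance over GPS samples.

    Low values suggest a person visits few places; passive-monitoring
    research has explored such mobility features as correlates of
    depressive symptoms. A small epsilon avoids log(0) when the
    person never moves.
    """
    lats = [p[0] for p in points]
    lons = [p[1] for p in points]
    return math.log(pvariance(lats) + pvariance(lons) + 1e-12)

# Fabricated sample trajectories
home_bound = [(52.5200, 13.4050)] * 20              # barely moves
mobile = [(52.52, 13.40), (52.49, 13.36), (52.55, 13.45),
          (52.47, 13.32), (52.53, 13.41), (52.50, 13.38)]
print(location_variance(home_bound) < location_variance(mobile))  # True
```

The identical computation serves a clinical pilot study in one context and a commercial behavioral profile in another; nothing in the data or the code distinguishes the two purposes.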

At first glance, the use of personal data for commercial profiling and medical monitoring purposes may look identical. But the motivation for commercial organizations using algorithms to define emotions or mental state is to make money, not to help patients. Most algorithms used by commercial organizations are protected by trade secrets in the US, so independent validation is not possible (Schmitz 2014). As shown with Google Flu Trends (a flu tracking algorithm), the published results could not be replicated with publicly available information (Lazer et al. 2014). Some commercial organizations have many more parameters for each person, and many more people in their stores of big data, and may imply they use refined versions of published algorithms. However, commercial organizations are not qualified or licensed to diagnose or dispense medical opinions or advice. If an algorithm from a commercial organization suggests a person has a “propensity to search for depression,” this information should not be treated as a medical fact, and should not impact one’s chance for employment, promotion, or credit (Pasquale 2015; Rosenblat et al. 2014).

The ability of commercial algorithms to recognize human emotions and mental states will keep improving, given the massive investment in this area. By 2020, the global investment in emotion detection and recognition technologies is expected to reach $22.65 billion (Marketsandmarkets 2016). There must be a clear distinction between algorithmic findings from the practice of psychiatry and commercial findings made for profit, even though similar analytic approaches are used.

Issue 4:

What is the message to patients when physicians recommend passive monitoring of mental health?

Patients who live with a chronic mental illness develop a set of coping skills that are specific to their disease and personal living situation. The skills will differ with the disease severity, general medical health, access to resources, cultural factors, and individual attitudes. Today, the message from physicians is that patients can learn the skills to recognize and control symptoms and participate in society. Changing this message to emphasize passive monitoring and reliance on technology will be welcomed by some patients and offer opportunities to reach those who do not respond to standard approaches. However, some patients with mental illness may prefer to develop and depend on coping skills rather than passive monitoring.

Although some patients welcome it, several lines of evidence suggest that passive monitoring may not be of universal interest. The demographics of smartwatch and fitness tracker users in the US general public show that 2/3 of smartwatch owners are males between ages 18 and 34, and 41% of fitness tracker users have an income about double the national average (Gustafson 2015; Luhby 2015). In studies of passive monitoring of patients with chronic medical illness, issues reported include privacy, not feeling in control, preferring existing coping mechanisms, losing dignity, and not wanting a constant reminder of their illness (Mol 2009; Storni 2014; Schüll 2016; Coughlin et al. 2007).

Patient attitudes towards passive monitoring are also important because cooperation and participation are required, even while having symptoms. Patients must be aware of routine technological issues and actions that affect the results, including battery failures, turning off the smartphone, lending the smartphone to someone else, storage location such as in a purse, configuration settings such as location tracking, camera covers, being out of cell phone range, and dropped calls (Baig and Gholamhosseini 2013; Burns et al. 2011; Aranki et al. 2014).

There is considerable concern that passive monitoring tools may inadvertently increase the stigma associated with mental illness. The concept that some individuals require passive monitoring for mental stability may be easily misinterpreted by the general public, who often associate mental illness with violence (Pescosolido 2013). The situation will become worse if passive monitoring is used as a punishment, such as for non-adherence, or to facilitate the job of healthcare workers. Consider that continuous GPS monitoring is only required in the US after the release from prison of offenders who committed the most heinous crimes (CDCR 2016; Shekhter 2010). If medicine promotes passive monitoring of the mentally ill, it is important to address the reality of stigma in society, and take measures to prevent further social discrimination.

Issue 5:

Do physicians and healthcare administrators need education about the digital economy?

Physicians and healthcare administrators are a diverse group with different levels of interest in technology, but all need to have a basic understanding of the digital economy to avoid causing inadvertent harm to patients. Many are enthusiastic and regular users of technology, and are proficient at using smartphones, tablets, and apps. Some physicians see predictive algorithms based on big data from digital devices leading to dramatic improvement in patient care (Topol et al. 2015; Darcy et al. 2016). Other, especially older, physicians are not always comfortable with technology. For example, many physicians find that EMR systems are hard to use, time-consuming, and decrease the time available for patients (McDonald et al. 2014; Accenture 2015; Dünnebeil et al. 2012). While this may reflect the poor usability of some EMR products, nearly 1 in 5 US physicians employs a medical scribe who joins the doctor and patient in the examination room to enter data into an EMR system (Gillespie 2015; Gellert et al. 2015).

From a financial perspective, some view widely available smartphone apps as a low-cost substitute for traditional care and services for mental illness. However, there is little evidence of efficacy for the numerous apps available for mental health (Shen et al. 2015; Payne et al. 2015; Huguet et al. 2016; Donker et al. 2013; Karasouli and Adams 2014; Nicholas et al. 2015; Anthes 2016).

Even enthusiastic adopters of technology may not be educated about the digital economy. It is important that physicians who recommend the use of technology to patients, and administrators who set policy for the use of technology, be aware of the potential negative consequences of the tracking of personal data. Digital tools are an important and evolving part of medicine, and physicians and administrators need education, with regular updates, from independent sources rather than from vendors selling products.

Issue 6:

Should individual physicians validate smartphone apps used to make treatment decisions?

Smartphone apps that provide data used for treatment decisions should be validated. The recent experience with the UK Health Apps Library underscores the challenge. Although a new app approval process is planned for 2017 (Gov.UK 2016), studies found inadequate security in 89% of 79 accredited apps tested (Huckvale et al. 2015), and unproved clinical value in over 85% of accredited mental health apps (Leigh and Flatt 2015). A certification process must confirm not only that an app is effective and has clinical value, but must also consider real-world operation; the pathway for all data collection, sharing, storage, retention, ownership, analysis, and reanalysis; and must validate the specific algorithm and the conclusions drawn. There are numerous technical issues relating to data security, privacy, access control, encryption, error handling, data provenance, data storage, and data transmission (Kotz et al. 2016). Other key issues include the technical support structure available to maintain and upgrade the app over time, the frequency of security recertifications, and the requirements for recertification and the data ownership policy if a company is sold.
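As a rough illustration only (the check names below paraphrase the dimensions listed above and are not drawn from any actual certification scheme), such requirements could be expressed as an explicit checklist, so that no dimension can be silently skipped when an app is assessed:

```python
# Hypothetical certification dimensions, paraphrased from the discussion above.
CERTIFICATION_CHECKS = [
    "clinical_effectiveness",
    "real_world_operation",
    "data_collection_pathway",
    "data_sharing_storage_retention",
    "data_ownership",
    "algorithm_validation",
    "security_and_encryption",
    "support_and_upgrade_plan",
    "recertification_policy_on_sale",
]

def certify(results):
    """results maps a check name to a bool; an app passes only if every
    listed dimension was assessed and passed."""
    missing = [c for c in CERTIFICATION_CHECKS if c not in results]
    failed = [c for c in CERTIFICATION_CHECKS if not results.get(c, False)]
    return {"passed": not failed, "missing": missing, "failed": failed}

print(certify({})["passed"])  # an empty assessment cannot pass
```

The point of the structure is that an unassessed dimension counts as a failure, mirroring the paper's argument that effectiveness alone is not sufficient for certification.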

An app that collects data from hardware components or sensors needs to be certified separately for each make and model. In today’s marketplace, one typically purchases a smartphone and then purchases an app at a later date. Consider the complexity if an app collects data from sensors. The hardware manufacturer has a set of technical specifications for each sensor, which are designed to meet the needs of a consumer smartphone, not of medical monitoring. Hardware devices contain components from many suppliers, and these vary throughout the manufacturing life of a product model. This means that two smartphones of the same make and model purchased on the same day may contain different sensors (Asif 2015, 2016; Florin 2016) and provide slightly different data that may or may not be suitable for use in medical monitoring.
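One practical consequence of this hardware variability is that monitoring data need provenance. A minimal, hypothetical sketch follows; the field names and identifiers are invented for illustration and do not correspond to any real device API:

```python
def tag_reading(value, device_model, sensor_id, firmware=None):
    """Attach hardware provenance to a single sensor reading.

    Two phones of the same make and model may contain sensor components
    from different suppliers, so a reading is only interpretable alongside
    the identifiers of the exact hardware that produced it.
    """
    return {
        "value": value,
        "device_model": device_model,   # e.g. as reported by the OS
        "sensor_id": sensor_id,         # component-level identifier
        "firmware": firmware,           # optional software provenance
    }

r = tag_reading(72.0, "PhoneModelX", "accel-vendorA-rev2")
print(r["device_model"], r["sensor_id"])
```

Recording this metadata at collection time would let a certifying organization or analyst later distinguish sensor-component differences from genuine clinical variation.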

While there is no obvious solution, a certifying organization that is independent of all commercial vendors is needed to validate apps that collect data used for treatment decisions. This certifying organization must have both clinical and technical expertise so that physicians can reliably recommend certified products to their patients. The certifying process must be ongoing, since consumer electronics change rapidly, with new smartphone models appearing yearly and bringing new privacy and technical challenges. The scope of the validation problem is particularly challenging for mental health apps because of their disproportionately large number. Of the disease-specific apps available to US consumers in 2015, 29% were for mental health, followed by 15% for diabetes and 8% for blood and circulatory conditions (IMS 2015). Furthermore, the number of medical app developers is growing rapidly, with an estimated 58,000 worldwide (Research 2 Guidance 2016). It is also important that patients are aware that apps that are not involved in treatment decisions, and are not certified, may contain errors and may not protect patient privacy.


This discussion provides only a partial list of the ethical challenges and does not offer specific solutions to these complex problems. Many significant issues were omitted, such as how patient monitoring systems handle data inadvertently captured about other people, including facial images, voice recordings, and metadata (Rana et al. 2016), and new legal issues such as the timeliness of response to monitoring data (Armontrout et al. 2016). Other omitted issues include whether health-related chatbots (automated conversational software) should deceive patients into thinking they are interacting with a human (Whitby 2014), the coming of medications with sensors for adherence monitoring (Kane et al. 2013), the monitoring of people with dementia (Niemeijer et al. 2011), and the evaluation of long-term clinical value.

The challenges related to the adoption of new technologies, including operational and technical issues and the threat of malicious hacking into every electronic device and system used by patients and providers, were also not included. Provider responsibility for securing medical data was not discussed, even though breaches in the US involved over 113 million records in the year 2015 (GAO 2016). The productivity paradox associated with new technologies, whereby increased productivity and cost savings require an expensive multiyear process reengineering effort, was omitted (Jones et al. 2012; Brynjolfsson and Hitt 1998; Katz et al. 2012). Finally, there was no discussion of automation bias (unthinking reliance on technology) in relation to patient monitoring, which may be of concern given the quality of many sensors used in smartphones and wearables (Puentes et al. 2013; Baig and Gholamhosseini 2013; Banaee et al. 2013; Burns et al. 2011; Meltzer et al. 2015; Goode 2016).

Conclusions and future directions

In the future, physicians will have to address technology issues to provide quality care to their patients. The digital revolution in medicine offers exciting new directions for the treatment of mental illness, including online psychotherapy, tools to support medication adherence, telemedicine, and research based on linked medical records. Along with these opportunities come extraordinarily complex challenges to privacy and security as part of the digital economy. There are a variety of new ethical issues facing physicians in relation to recommending the use of technology. Commercial activities such as the profiling of behavior and mental state pose major non-medical concerns for patients with mental illness. The use of unvalidated apps, poor quality online information, self-diagnosis and self-treatment, and unique problems with passive monitoring pose major medical concerns. To maximize the potential of technology to help patients with mental illness, physicians need education about the basics of the digital economy, and must help patients understand the limits and benefits of these technologies.


  1. AACC (American Association for Clinical Chemistry). Direct-to-consumer laboratory testing. 2015. Accessed 8 Oct 2016.

  2. Accenture. Accenture doctors survey 2015: healthcare IT pain and progress. 2015. Accessed 8 Oct 2016.

  3. Acquisti A, Varian HR. Conditioning prices on purchase history. Mark Sci. 2005;24:367–81.

  4. Acquisti A, Brandimarte L, Loewenstein G. Privacy and human behavior in the age of information. Science. 2015;347:509–14.

  5. Aetna. Aetna brings new iTriage employer technology to mid-sized businesses. 2013. Accessed 8 Oct 2016.

  6. Andreassen HK, Bujnowska-Fedak MM, Chronaki CE, Dumitru RC, Pudule I, Santana S, et al. European citizens’ use of E-health services: a study of seven countries. BMC Public Health. 2007;7:1.

  7. Anthes E. Mental health: there’s an app for that. Nature. 2016;532:20–3.

  8. Aranki D, Kurillo G, Yan P, Liebovitz DM, Bajcsy R. Continuous, real-time, tele-monitoring of patients with chronic heart-failure: lessons learned from a pilot study. In: Proceedings of the 9th international conference on body area networks. ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering); 2014. p. 135–41.

  9. Armontrout J, Torous J, Fisher M, Drogin E, Gutheil T. Mobile mental health: navigating new rules and regulations for digital tools. Curr Psychiatry Rep. 2016;18:91.

  10. Arthur WB. What is technology and how does it evolve? New York Academy of Sciences Magazine. 2010. Accessed 8 Oct 2016.

  11. Asif S. Here’s the difference between Samsung and Sony camera sensors on the Galaxy S6 and S6 edge. 2015. Accessed 8 Oct 2016.

  12. Asif S. Like the Galaxy S6, Samsung is using two different camera sensors for the Galaxy S7 duo. 2016. Accessed 8 Oct 2016.

  13. Baig MM, Gholamhosseini H. Smart health monitoring systems: an overview of design and modeling. J Med Syst. 2013;37:9898.

  14. Banaee H, Ahmed MU, Loutfi A. Data mining for wearable sensors in health monitoring systems: a review of recent trends and challenges. Sensors. 2013;13:17472–500.

  15. Barak A, Gluck-Ofri O. Degree and reciprocity of self-disclosure in online forums. CyberPsychol Behav. 2007;10:407–17.

  16. Barratt MJ, Ferris JA, Winstock AR. Use of Silk Road, the online drug marketplace, in the United Kingdom, Australia and the United States. Addiction. 2014;109:774–83.

  17. Barron E. Google wants to take your temperature and count your heartbeat. 2016. Accessed 8 Oct 2016.

  18. Batty M, Tripathi A, Kroll A, Wu C, Moore D, Stehno C, et al. Predictive modeling for life insurance. Deloitte Consulting LLP. 2010. Accessed 8 Oct 2016.

  19. Bauer M, Glenn T, Conell J, Rasgon N, Marsh W, Sagduyu K, et al. Common use of dietary supplements for bipolar disorder: a naturalistic, self-reported study. Int J Bipolar Disord. 2015;3:1–7.

  20. Bauer R, Conell J, Glenn T, Alda M, Ardau R, Baune BT, et al. Internet use by patients with bipolar disorder: results from an international multisite survey. Psychiatry Res. 2016;242:388–94.

  21. Benkler Y. Degrees of freedom, dimensions of power. Daedalus. 2016;145:18–32.

  22. Ben-Zeev D, Wang R, Abdullah S, Brian R, Scherer EA, Mistler LA, et al. Mobile behavioral sensing for outpatients and inpatients with schizophrenia. Psychiatr Serv. 2016;67:558–61.

  23. Bert F, Galis V, Passi S, Rosaria Gualano M, Siliquini R. Differences existing between USA and Europe in opioids purchase on Internet: an interpretative review. J Subst Use. 2015;20:200–7.

  24. Biobank. About UK Biobank. Accessed 8 Oct 2016.

  25. Borzekowski DL, Schenk S, Wilson JL, Peebles R. e-Ana and e-Mia: a content analysis of pro-eating disorder Web sites. Am J Public Health. 2010;100:1526–34.

  26. Brachman S. IBM seeks patent on software that incorporates human emotion. IP Watchdog. 2014. Accessed 8 Oct 2016.

  27. Brandimarte L, Acquisti A, Loewenstein G. Misplaced confidences privacy and the control paradox. Soc Psychol Personal Sci. 2013;4:340–7.

  28. Brynjolfsson E, Hitt LM. Beyond the productivity paradox. Commun ACM. 1998;41:49–55.

  29. Büchi M, Just N, Latzer M. Modeling the second-level digital divide: a five-country study of social differences in Internet use. New Media Soc. 2015;9:1461444815604154.

  30. Burns MN, Begale M, Duffecy J, Gergle D, Karr CJ, Giangrande E, et al. Harnessing context sensing to develop a mobile intervention for depression. J Med Internet Res. 2011;13:e55.

  31. Caine K, Hanania R. Patients want granular privacy control over health information in electronic medical records. J Am Med Inform Assoc. 2013;20:7–15.

  32. Cairns A. Op-Ed: privacy concerns jeopardise healthcare innovation. Digit Journal. 2015. Accessed 8 Oct 2016.

  33. California HealthCare Foundation. Consumers and health information technology: a national survey. 2010. Accessed 8 Oct 2016.

  34. Calvo RA, D’Mello S. Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans Affect Comput. 2010;1:18–37.

  35. Cambria E. Affective computing and sentiment analysis. IEEE Intell Syst. 2016;31:102–7.

  36. CDCR (California Department of Corrections and Rehabilitation). Electronic monitoring. 2016. Accessed 8 Oct 2016.

  37. Chancellor S, Lin Z, Goodman EL, Zerwas S, De Choudhury M. Quantifying and predicting mental illness severity in online pro-eating disorder communities. In: Proceedings of the 19th ACM conference on computer-supported cooperative work & social computing. ACM; 2016. p. 1171–84.

  38. Choi NG, DiNitto DM. The digital divide among low-income homebound older adults: internet use patterns, eHealth literacy, and attitudes toward computer/Internet use. J Med Internet Res. 2013;15:e93.

  39. Christovich MM. Why should we care what Fitbit shares?: a proposed statutory solution to protect sensitive personal fitness information. Hastings Commun Entertain Law J. 2016;38:91.

  40. Chung JE. Patient–provider discussion of online health information: results from the 2007 Health Information National Trends Survey (HINTS). J Health Commun. 2013;18:627–48.

  41. Claycomb M, Black AC, Wilber C, Brocke S, Lazar CM, Rosen MI. Financial victimization of adults with severe mental illness. Psychiatr Serv. 2013;64:918–20.

  42. Claypoole T. Privacy and social media. ABA Business Law Today. 2014. Accessed 8 Oct 2016.

  43. CMU (Carnegie Mellon University). Carnegie Mellon report finds Internet privacy tools are confusing, ineffective for most people. 2011. Accessed 8 Oct 2016.

  44. Conell J, Bauer R, Glenn T, Alda M, Ardau R, Baune BT, et al. Online information seeking by patients with bipolar disorder: results from an international multisite survey. Int J Bipolar Disord. 2016;4:17.

  45. Conti G, Sobiesk E. Malicious interface design: exploiting the user. In: Proceedings of the 19th international conference on World Wide Web. ACM; 2010. p. 271–80.

  46. Coughlin JF, D’Ambrosio LA, Reimer B, Pratt MR. Older adult perceptions of smart home technologies: Implications for research, policy & market innovations in healthcare. In: 2007 29th annual international conference of the IEEE engineering in medicine and biology society. IEEE; 2007. p. 1810–15.

  47. Crangle CE, Kart JB. A questions-based investigation of consumer mental-health information. PeerJ. 2015;3:e867.

  48. Cruz-Jesus F, Vicente MR, Bacao F, Oliveira T. The education-related digital divide: an analysis for the EU-28. Comput Human Behav. 2016;56:72–82.

  49. Darcy AM, Louie AK, Roberts LW. Machine learning and the profession of medicine. JAMA. 2016;315:551–2.

  50. De Choudhury M, Kiciman E, Dredze M, Coppersmith G, Kumar M. Discovering shifts to suicidal ideation from mental health content in social media. In: Proceedings of the 2016 CHI conference on human factors in computing systems. ACM; 2016. p. 2098–110.

  51. De Freitas J, Falls BA, Haque OS, Bursztajn HJ. Vulnerabilities to misinformation in online pharmaceutical marketing. J R Soc Med. 2013;106:184–9.

  52. Dennison L, Morrison L, Conway G, Yardley L. Opportunities and challenges for smartphone applications in supporting health behavior change: qualitative study. J Med Internet Res. 2013;15:e86.

  53. Dinakar S, Andhale P, Rege M. Sentiment analysis of social network content. In: 2015 international conference on information reuse and integration (IRI). IEEE; 2015. p. 189–92.

  54. Dixon P, Gellman B. The scoring of America: how secret consumer scores threaten your privacy and your future. World Privacy Forum. 2014. Accessed 8 Oct 2016.

  55. Donker T, Petrie K, Proudfoot J, Clarke J, Birch MR, Christensen H. Smartphones for smarter delivery of mental health programs: a systematic review. J Med Internet Res. 2013;15:e247.

  56. Downs JS, Holbrook M, Cranor LF. Behavioral response to phishing risk. In: Proceedings of the anti-phishing working groups 2nd annual eCrime researchers summit. ACM; 2007. p. 37–44.

  57. Duke University. Duke health measures. Duke anxiety-depression scale (Duke-AD). 2016. Accessed 8 Oct 2016.

  58. Dünnebeil S, Sunyaev A, Blohm I, Leimeister JM, Krcmar H. Determinants of physicians’ technology acceptance for e-health in ambulatory care. Int J Med Inform. 2012;81:746–60.

  59. Ebeling M. ‘Get with the program!’: pharmaceutical marketing, symptom checklists and self-diagnosis. Soc Sci Med. 2011;73:825–32.

  60. Ebling MR. Can cognitive assistants disappear? IEEE Pervasive Comput. 2016;15:4–6.

  61. El Ayadi M, Kamel MS, Karray F. Survey on speech emotion recognition: features, classification schemes, and databases. Pattern Recognit. 2011;44:572–87.

  62. Englehardt S, Narayanan A. Online tracking: A 1-million-site measurement and analysis. 2016. Princeton Web Census. Accessed 8 Oct 2016.

  63. eTRIKS. People ‘don’t trust’ NHS with personal data, survey says. 2016. Accessed 18 Oct 2016.

  64. European Parliament News. Q&A: new EU rules on data protection put the citizen back in the driving seat. 2016. Accessed 8 Oct 2016.

  65. EU-US Privacy Shield Fact Sheet. European Commission. 2016. Accessed 8 Oct 2016.

  66. Evans D. Thanks to IoE, the next decade looks positively ‘nutty’. Cisco Blog. 2013. Accessed 8 Oct 2016.

  67. Executive Office of the President. Big data: seizing opportunities, preserving values. 2014. Accessed 8 Oct 2016.

  68. Executive Office of the President. Big data: a report on algorithmic systems, opportunity, and civil rights. 2016. Accessed 8 Oct 2016.

  69. Fair Warning. How privacy considerations drive patient decisions and impact patient care outcomes. 2011. Accessed 8 Oct 2016.

  70. Fair Warning. UK Patient privacy survey. 2012. Accessed 8 Oct 2016.

  71. Fairfield JA, Engel C. Privacy as a public good. Duke Law J. 2015;65:385–569.

  72. Faurholt-Jepsen M, Vinberg M, Frost M, Debel S, Margrethe Christensen E, Bardram JE, et al. Behavioral activities collected through smartphones and the association with illness activity in bipolar disorder. Int J Methods Psychiatr Res. 2016;25:309–23.

  73. Fertik M. The rich see a different internet than the poor. Scientific American. 2013. Accessed 8 Oct 2016.

  74. Florin T. iPhone 7 and iPhone 7 Plus may both offer OIS, LG and Sony could share camera module orders. 2016. Accessed 8 Oct 2016.

  75. Flynn HA, Marcus SM, Kerber K, Alessi N. Patients’ concerns about and perceptions of electronic psychiatric records. Psychiatr Serv. 2003;54:1539–41.

  76. Fogg BJ, Nass C. How users reciprocate to computers: an experiment that demonstrates behavior change. In: CHI’97 extended abstracts on human factors in computing systems. ACM; 1997. p. 331–32.

  77. Fogg BJ, Soohoo C, Danielson DR, Marable L, Stanford J, Tauber ER. How do users evaluate the credibility of web sites?: a study with over 2,500 participants. In: Proceedings of the 2003 conference on designing for user experiences. ACM; 2003 p. 1–15.

  78. Friedland SI. I spy: the new self-cybersurveillance and the “Internet of Things”. Wash Lee Law Rev. 2015;72:1459–501.

  79. Friemel TN. The digital divide has grown old: determinants of a digital divide among seniors. New Media Soc. 2016;18:313–31.

  80. FTC (Federal Trade Commission). Data brokers: a call for transparency and accountability. 2014. Accessed 8 Oct 2016.

  81. FTC (Federal Trade Commission). Big data: a tool for inclusion or exclusion? Understanding the issues (FTC Report). 2016a. Accessed 8 Oct 2016.

  82. FTC. Lumosity to pay $2 million to settle FTC deceptive advertising charges for its “brain training” program. 2016b. Accessed 8 Oct 2016.

  83. GAO. HHS needs to strengthen security and privacy guidance and oversight. 2016. Accessed 8 Oct 2016.

  84. GAO (Government Accountability Office). Information resellers: consumer privacy framework needs to reflect changes in technology and the marketplace. 2013. Accessed 8 Oct 2016.

  85. Gellert GA, Ramirez R, Webster SL. The rise of the medical scribe industry: implications for the advancement of electronic health records. JAMA. 2015;313:1315–6.

  86. Ghodse H. ‘Uppers’ keep going up. Br J Psychiatry. 2007;191:279–81.

  87. Giles DC, Newbold J. Self-and other-diagnosis in user-led mental health online communities. Qual Health Res. 2011;21:419–28.

  88. Gillespie L. Jobs for medical scribes are rising rapidly but standards lag. Kaiser Health News. 2015. Accessed 8 Oct 2016.

  89. Glenn T, Monteith S. New measures of mental state and behavior based on data collected from sensors, smartphones, and the Internet. Curr Psychiatry Rep. 2014;16:523.

  90. Goldfarb A, Tucker CE. Online advertising, behavioral targeting, and privacy. Commun ACM. 2011;54:25–7.

  91. Goldfarb A, Tucker C. Privacy and innovation. In: Lerner J, Stern S, editors. Innovation policy and the economy, vol. 12. Chicago: University of Chicago Press; 2012. p. 65–89.

  92. Gonzales A. The contemporary US digital divide: from initial access to technology maintenance. Inf Commun Soc. 2016;19:234–48.

  93. Goode L. Fitbit hit with class-action suit over inaccurate heart rate monitoring. The Verge. 2016. Accessed 8 Oct 2016.

  94. Google. Google analytics help. Overview of content experiments. 2016a. Accessed 8 Oct 2016.

  95. Google. This year’s founders’ letter. 2016b. Accessed 8 Oct 2016.

  96. Gov.UK. New plans to expand the use of digital technology across the NHS. UK Department of Health. 2016. Accessed 8 Oct 2016.

  97. Gralla P. Google CEO Schmidt: we can know everything about you. Computerworld. 2010. Accessed 8 Oct 2016.

  98. Greene JA. Do-it-yourself medical devices—technology and empowerment in American health care. N Engl J Med. 2016;374:305–8.

  99. Grohol JM, Slimowicz J, Granda R. The quality of mental health information commonly searched for on the Internet. Cyberpsychol Behav Soc Netw. 2014;17:216–21.

  100. Gross R, Acquisti A. Information revelation and privacy in online social networks. In: Proceedings of the 2005 ACM workshop on privacy in the electronic society. ACM; 2005. p. 71–80.

  101. Gruenerbl A, Osmani V, Bahle G, Carrasco JC, Oehler S, Mayora O, et al. Using smart phone mobility traces for the diagnosis of depressive and manic episodes in bipolar patients. In: Proceedings of the 5th augmented human international conference. ACM; 2014. p. 38.

  102. Gustafson K. Smartwatch or fitness tracker? Why age, sex matter. CNBC. 2015. Accessed 8 Oct 2016.

  103. Hargittai E. Digital na(t)ives? Variation in internet skills and uses among members of the “net generation”. Sociol Inq. 2010;80:92–113.

  104. Harris T. How technology hijacks people’s minds — from a magician and Google’s design ethicist. 2016. Accessed 8 Oct 2016.

  105. Hern A. Mark Zuckerberg tapes over his webcam. Should you? The Guardian. 2016. Accessed 8 Oct 2016.

  106. Hilbert M. Technological information inequality as an incessantly moving target: the redistribution of information and communication capacities between 1986 and 2010. J Assoc Inf Sci Technol. 2014;65:821–35.

  107. Hilbert M. The bad news is that the digital access divide is here to stay: domestically installed bandwidths among 172 countries for 1986–2014. Telecomm Policy. 2016;40:567–81.

  108. Hoofnagle CJ, King J, Li S, Turow J. How different are young adults from older adults when it comes to information privacy attitudes and policies? 2010. Accessed 8 Oct 2016.

  109. Huckvale K, Prieto JT, Tilney M, Benghozi PJ, Car J. Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment. BMC Med. 2015;13:214.

  110. Huguet A, Rao S, McGrath PJ, Wozney L, Wheaton M, Conrod J, et al. A systematic review of cognitive behavioral therapy and behavioral activation apps for depression. PLoS ONE. 2016;11:e0154248.

  111. IBM. IBM Watson health announces plans to acquire Truven Health Analytics for $2.6b, extending its leadership in value-based care solutions. 2016. Accessed 8 Oct 2016.

  112. ICILS (The International Computer and Information Literacy Study). Main findings and implications for education policies in Europe. European commission. 2014. Accessed 8 Oct 2016.

  113. IDC. The digital universe of opportunities: rich data and the increasing value of the Internet of Things. 2014. Accessed 8 Oct 2016.

  114. IMS. Patient adoption of mHealth. 2015. Accessed 8 Oct 2016.

  115. ITU (International Telecommunications Union). Measuring the information society 2014. 2014. Accessed 8 Oct 2016.

  116. Ivanitskaya L, O’Boyle I, Casey AM. Health information literacy and competencies of information age students: results from the interactive online research readiness self-assessment (RRSA). J Med Internet Res. 2006;8(2):e6.

  117. Jain AK, Duin RP, Mao J. Statistical pattern recognition: a review. IEEE Trans Pattern Anal Mach Intell. 2000;22:4–37.

  118. Jerritta S, Murugappan M, Nagarajan R, Wan K. Physiological signals based human emotion recognition: a review. In: Signal processing and its applications (CSPA), 2011 IEEE 7th international colloquium. IEEE; 2011. p. 410–15.

  119. Johnson N. Privacy no longer a social norm, says Facebook founder. The Guardian. 2010. Accessed 8 Oct 2016.

  120. Jones SS, Heaton PS, Rudin RS, Schneider EC. Unraveling the IT productivity paradox–lessons for health care. N Engl J Med. 2012;366:2243–5.

  121. Kane JM, Perlis RH, DiCarlo LA, Au-Yeung K, Duong J, Petrides G. First experience with a wireless system incorporating physiologic assessments and direct confirmation of digital tablet ingestions in ambulatory patients with schizophrenia or bipolar disorder. J Clin Psychiatry. 2013;74:e533–40.

  122. Karam ZN, Provost EM, Singh S, Montgomery J, Archer C, Harrington G, et al. Ecologically valid long-term mood monitoring of individuals with bipolar disorder using speech. In: 2014 IEEE international conference on acoustics, speech and signal processing (ICASSP). IEEE; 2014. p. 4858–62.

  123. Karasouli E, Adams A. Assessing the evidence for e-resources for mental health self-management: a systematic literature review. JMIR Ment Health. 2014;1:e3.

  124. Katz R, Mesfin T, Barr K. Lessons from a community-based mHealth diabetes self-management program: “it’s not just about the cell phone”. J Health Commun. 2012;17(Suppl 1):67–72.

  125. Kerr I, Earle J. Prediction, preemption, presumption: how big data threatens big picture privacy. Stanf Law Rev Online. 2013;66:65.

  126. Kerr LK, Kerr LD. Screening tools for depression in primary care. Beck depression inventory. West J Med. 2001;175:349–52.

  127. Kim KK, Joseph JG, Ohno-Machado L. Comparison of consumers’ views on electronic data sharing for healthcare and research. J Am Med Inform Assoc. 2015;22:821–30.

  128. Klee A, Stacy M, Rosenheck R, Harkness L, Tsai J. Interest in technology-based therapies hampered by access: a survey of veterans with serious mental illnesses. Psychiatr Rehabil J. 2016;39:173–9.

  129. Kleinman J. Facebook wants to turn your face into emoji. TechnoBuffalo. 2016. Accessed 8 Oct 2016.

  130. Kleinsmith A, Bianchi-Berthouze N. Affective body expression perception and recognition: a survey. IEEE Trans Affect Comput. 2013;4:15–33.

  131. Kosinski M, Stillwell D, Graepel T. Private traits and attributes are predictable from digital records of human behavior. Proc Natl Acad Sci USA. 2013;110:5802–5.

  132. Kotz D, Gunter CA, Kumar S, Weiner JP. Privacy and security in mobile health: a research agenda. Computer. 2016;49:22–30.

  133. Kuehn BM. More than one-third of US individuals use the Internet to self-diagnose. JAMA. 2013;309:756–7.

  134. Lardinois F. Microsoft CEO Satya Nadella on how AI will transform his company. 2016. Accessed 8 Oct 2016.

  135. Lazer D, Kennedy R, King G, Vespignani A. Big data. The parable of Google Flu: traps in big data analysis. Science. 2014;343:1203–5.

  136. Leigh S, Flatt S. App-based psychological interventions: friend or foe? Evid Based Ment Health. 2015;18:97–9.

  137. Leontiadis N, Moore T, Christin N. Pick your poison: pricing and inventories at unlicensed online pharmacies. In: Proceedings of the fourteenth ACM conference on electronic commerce. ACM; 2013. p. 621–38.

  138. Libert T. Privacy implications of health information seeking on the web. Commun ACM. 2015;58:68–77.

  139. LiKamWa R, Liu Y, Lane ND, Zhong L. MoodScope: building a mood sensor from smartphone usage patterns. In: Proceeding of the 11th annual international conference on mobile systems, applications, and services. ACM; 2013. p. 389–402.

  140. Liu B. Sentiment analysis and subjectivity. In: Indurkhya N, Damerau FJ, editors. Handbook of natural language processing. Boca Raton: CRC Press; 2010. p. 627–66.

  141. Luhby T. Typical American family earned $53,657 last year. CNN Money. 2015. Accessed 8 Oct 2016.

  142. Luo W, Najdawi M. Trust-building measures: a review of consumer health portals. Commun ACM. 2004;47:108–13.

  143. Lupton D. The digitally engaged patient: self-monitoring and self-care in the digital health era. Soc Theory Health. 2013;11:256–70.

  144. Lupton D, Jutel A. ‘It’s like having a physician in your pocket!’ A critical analysis of self-diagnosis smartphone apps. Soc Sci Med. 2015;133:128–35.

  145. Luxton DD, June JD, Fairall JM. Social media and suicide: a public health perspective. Am J Public Health. 2012;102(Suppl 2):S195–200.

  146. Marketsandmarkets. Emotion detection and recognition market by technology (bio-sensors, nlp, machine learning, and others), software tools (facial expression, voice recognition and others), services, application areas, end users, and regions—global forecast to 2020. 2016. Accessed 8 Oct 2016.

  147. Martin KE. Ethical issues in the big data industry. MIS Q Exec. 2015;14:2.

  148. Mattioli D. On Orbitz, Mac users steered to pricier hotels. Wall Street Journal. 2012. Accessed 8 Oct 2016.

  149. Matwin S. Q&A: Dr. Stan Matwin, Dalhousie University. Faculty of Computer Science News. 2013. Accessed 8 Oct 2016.

  150. Mayhorn CB, Murphy-Hill E, Zielinska OA, Welk AK. The social engineering behind phishing. The Next Wave. 2015. Accessed 8 Oct 2016.

  151. McDonald AM, Cranor LF. Cost of reading privacy policies. ISJLP. 2008;4:543.

  152. McDonald CJ, Callaghan FM, Weissman A, Goodwin RM, Mundkur M, Kuhn T. Use of internist’s free time by ambulatory care electronic medical record systems. JAMA Intern Med. 2014;174:1860–3.

  153. McKinsey Global Institute. Big data: the next frontier for innovation, competition, and productivity. 2011. Accessed 8 Oct 2016.

  154. McPherson R, Shokri R, Shmatikov V. Defeating image obfuscation with deep learning. 2016. Accessed 8 Oct 2016.

  155. Meltzer LJ, Hiruma LS, Avis K, Montgomery-Downs H, Valentin J. Comparison of a commercial accelerometer with polysomnography and actigraphy in children and adolescents. Sleep. 2015;38:1323–30.

  156. Meng J, Zhang J, Zhao H. Overview of the speech recognition technology. In: 2012 Fourth international conference on computational and information sciences (ICCIS). IEEE; 2012. p. 199–202.

  157. Miller N. Analysis of user messages to J Med Libr Assoc. 2007;95:81–3.

  158. Miller CJ, McInnes DK, Stolzmann K, Bauer MS. Interest in use of technology for healthcare among veterans receiving treatment for mental health. Telemed J e-Health. 2016;22:847–54.

  159. Mitra S, Acharya T. Gesture recognition: a survey. IEEE Trans Syst Man Cybern C Appl Rev. 2007;37:311–24.

  160. Mol A. Living with diabetes: care beyond choice and control. Lancet. 2009;373:1756–7.

  161. Monteith S, Glenn T. Automated decision-making and big data: concerns for people with mental illness. Curr Psychiatry Rep. 2016;18:112.

  162. Monteith S, Glenn T, Bauer M. Searching the internet for health information about bipolar disorder: some cautionary issues. Int J Bipolar Disord. 2013;1:22.

  163. Monteith S, Glenn T, Geddes J, Whybrow PC, Bauer M. Big data for bipolar disorder. Int J Bipolar Disord. 2016a;4:1.

  164. Monteith S, Glenn T, Bauer R, Conell J, Bauer M. Availability of prescription drugs for bipolar disorder at online pharmacies. J Affect Disord. 2016b;193:59–65.

  165. Mostashari F. Statement on HIT. 2013. Accessed 8 Oct 2016.

  166. Nadkarni PM, Ohno-Machado L, Chapman WW. Natural language processing: an introduction. J Am Med Inform Assoc. 2011;18:544–51.

  167. Neus A. The quality of online registration information. In: Proceedings of the 2000 international conference on information quality. MIT Information Quality (MITIQ) Program. Accessed 8 Oct 2016.

  168. NHS Tools. Self assessments. 2016. Accessed 8 Oct 2016.

  169. Nicholas J, Larsen ME, Proudfoot J, Christensen H. Mobile apps for bipolar disorder: a systematic review of features and content quality. J Med Internet Res. 2015;17:e198.

  170. Niemeijer AR, Frederiks BJ, Depla MF, Legemaate J, Eefsting JA, Hertogh CM. The ideal application of surveillance technology in residential care for people with dementia. J Med Ethics. 2011;37:303–10.

  171. NLM. What is direct-to-consumer genetic testing? Aug 8, 2016. Accessed 8 Oct 2016.

  172. Noyes K. Scott McNealy on privacy: you still don’t have any. PC World. 2015. Accessed 8 Oct 2016.

  173. O’Connor A. New York attorney general targets supplements at major retailers. New York Times. 2015. Accessed 8 Oct 2016.

  174. OIG (US Office of the Inspector General). Dietary supplements: structure/function claims fail to meet federal requirements. 2012 Report (OEI-01-11-00210). Accessed 8 Oct 2016.

  175. Orizio G, Merla A, Schulz PJ, Gelatti U. Quality of online pharmacies and websites selling prescription drugs: a systematic review. J Med Internet Res. 2011;13(3):e74.

  176. Pantic M, Pentland A, Nijholt A, Huang TS. Human computing and machine understanding of human behavior: a survey. In: Huang TS, Nijholt A, Pantic M, Pentland A, editors. Artificial intelligence for human computing. Berlin Heidelberg: Springer; 2007. p. 47–71.

  177. Pariser E. The filter bubble: How the new personalized web is changing what we read and how we think. New York: Penguin Group (USA); 2011.

  178. Pasquale FA. Restoring transparency to automated authority. 2011. Accessed 8 Oct 2016.

  179. Pasquale F. The black box society. The secret algorithms that control money and information. Cambridge: Harvard University Press; 2015.

  180. Payne HE, Lister C, West JH, Bernhardt JM. Behavioral functionality of mobile apps in health interventions: a systematic review of the literature. JMIR mHealth uHealth. 2015;3:e20.

  181. PCAST (President’s Council of Advisors on Science and Technology). Big data and privacy: a technological perspective. 2014. Accessed 8 Oct 2016.

  182. Pentland A. Reinventing society in the wake of big data. Edge. 2012. Accessed 8 Oct 2016.

  183. Pescosolido BA. The public stigma of mental illness: what do we think; what do we know; what can we prove? J Health Soc Behav. 2013;54:1–21.

  184. Pew Research. HealthFactSheet. 2013. Accessed 8 Oct 2016.

  185. PMI (Precision Medicine Initiative). Cohort program working group report. 2015. Accessed 8 Oct 2016.

  186. Powles J. DeepMind’s data deals raise some serious questions. Financial Times. 2016. Accessed 12 Dec 2016.

  187. Puentes J, Montagner J, Lecornu L, Lähteenmäki J. Quality analysis of sensors data for personal health records on mobile devices. In: Bali R, Troshani I, Goldberg S, Wickramasinghe N, editors. Pervasive health knowledge management. New York: Springer; 2013. p. 103–33.

  188. Rana R, Hume M, Reilly J, Jurdak R, Soar J. Opportunistic and context-aware affect sensing on smartphones. IEEE Pervasive Comput. 2016;15:60–9.

  189. Reavley NJ, Jorm AF. The quality of mental disorder information websites: a review. Patient Educ Couns. 2011;85:e16–25.

  190. Research 2 Guidance. mHealth App Developer Economics 2016. 2016. Accessed 8 Oct 2016.

  191. Resnik P, Armstrong W, Claudino L, Nguyen T, Nguyen VA, Boyd-Graber J. Beyond LDA: exploring supervised topic modeling for depression-related language in Twitter. NAACL HLT 2015. (North American Chapter of the Association for Computational Linguistics—Human Language Technologies 2015). Accessed 8 Oct 2016.

  192. Robillard JM, Illes J, Arcand M, Beattie BL, Hayden S, Lawrence P, et al. Scientific and ethical features of English-language online tests for Alzheimer’s disease. Alzheimers Dement (Amst). 2015;1:281–8.

  193. Robins D, Holmes J, Stansbury M. Consumer health information on the web: the relationship of visual design and perceptions of credibility. J Assoc Inf Sci Technol. 2010;61:13–29.

  194. Robinson D, Yu H, Rieke A. Civil rights, big data, and our algorithmic future. The Leadership Conference. 2014. Accessed 8 Oct 2016.

  195. Rosenberg E. The business of Google. Investopedia. 2016. Accessed 8 Oct 2016.

  196. Rosenblat A, Kneese T, Boyd D. Networked employment discrimination. 2014. Accessed 8 Oct 2016.

  197. Ruoff A. Privacy laws stifling medical innovation, lawmakers say. Bloomberg Legal. 2016. Accessed 8 Oct 2016.

  198. Ryan A, Wilson S. Internet healthcare: do self-diagnosis sites do more harm than good? Expert Opin Drug Saf. 2008;7:227–9.

  199. Saeb S, Zhang M, Karr CJ, Schueller SM, Corden ME, Kording KP, et al. Mobile phone sensor correlates of depressive symptom severity in daily-life behavior: an exploratory study. J Med Internet Res. 2015;17:e175.

  200. Saint N. Google CEO: “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.” 2010. Accessed 8 Oct 2016.

  201. Sariyanidi E, Gunes H, Cavallaro A. Automatic analysis of facial affect: a survey of registration, representation, and recognition. IEEE Trans Pattern Anal Mach Intell. 2015;37:1113–33.

  202. Sarma S. I helped invent the Internet of Things. Here’s why I’m worried about how secure it is. Politico. 2015. Accessed 8 Oct 2016.

  203. Sarpatwari A, Gagne JJ. Balancing benefits and harms: privacy protection policies. Pharmacoepidemiol Drug Saf. 2016;25:969–71.

  204. Savage N. When computers stand in the schoolhouse door. Commun ACM. 2016;59:19–21.

  205. Schmitz A. Secret consumer scores and segmentations: separating consumer ‘haves’ from ‘have-nots’. Michigan State Law Review. 2014:1411. Accessed 8 Oct 2016.

  206. Schneier B. The Internet of Things will turn large-scale hacks into real world disasters. Motherboard. 2016. Accessed 8 Oct 2016.

  207. Schüll ND. Data for life: wearable technology and the design of self-care. BioSocieties. 2016;11:317–33.

  208. Schwartz PH, Caine K, Alpert SA, Meslin EM, Carroll AE, Tierney WM. Patient preferences in controlling access to their electronic health records: a prospective cohort study in primary care. J Gen Intern Med. 2015;30:25–30.

  209. Selwyn N. The digital native-myth and reality. Aslib Proc. 2009;61:364–79.

  210. Semigran HL, Linder JA, Gidengil C, Mehrotra A. Evaluation of symptom checkers for self diagnosis and triage: audit study. BMJ. 2015;351:h3480.

  211. Shekhter S. Every step you take, they’ll be watching you: the legal and practical implications of lifetime GPS monitoring of sex offenders. Hastings Const Law Q. 2010;38:1085.

  212. Shen N, Levitan MJ, Johnson A, Bender JL, Hamilton-Page M, Jadad AA, et al. Finding a depression app: a review and content analysis of the depression app marketplace. JMIR Mhealth Uhealth. 2015;3:e16.

  213. Sheng S, Holbrook M, Kumaraguru P, Cranor LF, Downs J. Who falls for phish?: a demographic analysis of phishing susceptibility and effectiveness of interventions. In: Proceedings of ACM CHI 2010 conference on human factors in computing systems. ACM; 2010. p. 373–82.

  214. Siau K, Shen Z. Building customer trust in mobile commerce. Commun ACM. 2003;46:91–4.

  215. Slabodkin G. IBM CEO: Watson health is ‘our moonshot’ in healthcare. Information Management. 2015. Accessed 8 Oct 2016.

  216. Snell E. Patient data breach fear hinders health data sharing. HealthITSecurity. 2017. Accessed 21 Jan 2017.

  217. Stark M, Fins JJ. Engineering medical decisions. Camb Q Healthc Ethics. 2013;22:373–81.

  218. Storni C. Design challenges for ubiquitous and personal computing in chronic disease care and patient empowerment: a case study rethinking diabetes self-monitoring. Pers Ubiquitous Comput. 2014;18:1277–90.

  219. Sun FT, Kuo C, Cheng HT, Buthpitiya S, Collins P, Griss M. Activity-aware mental stress detection using physiological sensors. In: Gris M, Yang G, editors. Mobile computing, applications, and services. Berlin: Springer; 2010. p. 211–30.

  220. Sutton T. Artificial intelligence is your health advisor. Designmind. 2016. Accessed 8 Oct 2016.

  221. Sweeney L. Discrimination in online ad delivery. Queue. 2013;11:10.

  222. Tamir DI, Mitchell JP. Disclosing information about the self is intrinsically rewarding. Proc Natl Acad Sci USA. 2012;109:8038–43.

  223. Tene O, Polonetsky J. A theory of creepy: technology, privacy and shifting social norms. Yale J Law Technol. 2013;16:59.

  224. Topol EJ, Steinhubl SR, Torkamani A. Digital medical tools and sensors. JAMA. 2015;313:353–4.

  225. Turow J, King J, Hoofnagle C, Bleakley A, Hennessy M. Americans reject tailored advertising and three activities that enable it. 2009. Accessed 8 Oct 2016.

  226. UCSF (University of California San Francisco) Fresno. Resources to help families and children. Edinburgh Postnatal Depression Scale. 2013. Accessed 8 Oct 2016.

  227. US Census. Health insurance coverage in the United States: 2015. 2016. Accessed 8 Oct 2016.

  228. USPTF (US Preventive Services Task Force). Depression Screening. 2015. Accessed 8 Oct 2016.

  229. VA. Check your mental health. 2016. Accessed 8 Oct 2016.

  230. Valentino-Devries J, Singer-Vine J, Soltani A. Websites vary prices, deals based on users. Wall Street Journal. 2012. Accessed 8 Oct 2016.

  231. Van Der Velden M, El Emam K. “Not all my friends need to know”: a qualitative study of teenage patients, privacy, and social media. J Am Med Inform Assoc. 2013;20:16–24.

  232. van Deursen AJ, Van Dijk JA. The digital divide shifts to differences in usage. New Media Soc. 2014;16:507–26.

  233. Wald R, Khoshgoftaar TM, Napolitano A, Sumner C. Using Twitter content to predict psychopathy. In: 11th international conference on machine learning and applications (ICMLA). IEEE; 2012. p. 394–401.

  234. Walker J. Data mining to recruit sick people. Wall Street Journal. 2013. Accessed 8 Oct 2016.

  235. Wang R, Aung MS, Abdullah S, Brian R, Campbell AT, Choudhury T, et al. CrossCheck: toward passive sensing and detection of mental health changes in people with schizophrenia. In: Proceedings of the 2016 ACM international joint conference on pervasive and ubiquitous computing. ACM; 2016. p. 886–97.

  236. Webb T, Joseph J, Yardley L, Michie S. Using the internet to promote health behavior change: a systematic review and meta-analysis of the impact of theoretical basis, use of behavior change techniques, and mode of delivery on efficacy. J Med Internet Res. 2010;12(1):e4.

  237. WEF (World Economic Forum). Digital transformation of industries: healthcare. 2016. Accessed 8 Oct 2016.

  238. WEF. Rethinking personal data: strengthening trust. 2012. Accessed 8 Oct 2016.

  239. Whitby B. The ethical implications of non-human agency in health care. In: Proceedings of MEMCA-14: (Machine ethics in the context of medical and care agents). 2014. Accessed 8 Oct 2016.

  240. White House. FACT SHEET: President Obama’s precision medicine initiative. 2015. Accessed 8 Oct 2016.

  241. Wigan MR, Clarke R. Big data’s big unintended consequences. Computer. 2013;46:46–53.

  242. Wong A. ONC health IT buzz. Announcing the behavioral health patient empowerment challenge. 2013. Accessed 8 Oct 2016.

  243. Wu P, Fuller C, Liu X, Lee HC, Fan B, Hoven CW, et al. Use of complementary and alternative medicine among women with depression: results of a national survey. Psychiatr Serv. 2007;58:349–56.

  244. Wurzman R, Hamilton RH, Pascual-Leone A, Fox MD. An open letter concerning do-it-yourself users of transcranial direct current stimulation. Ann Neurol. 2016;80:1–4.

  245. Xiong W, Droppo J, Huang X, Seide F, Seltzer M, Stolcke A, Yu D, Zweig G. Achieving human parity in conversational speech recognition. 2016.

  246. Yulinsky C. Decisions, decisions … will ‘Big Data’ have ‘Big’ impact? Financial Times. 2012. Accessed 8 Oct 2016.

  247. Zeng Z, Pantic M, Roisman GI, Huang TS. A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans Pattern Anal Mach Intell. 2009;31:39–58.

  248. Zhao W, Chellappa R, Phillips PJ, Rosenfeld A. Face recognition: a literature survey. ACM Comput Surv (CSUR). 2003;35:399–458.



Authors’ contributions

All authors were involved in the draft manuscript and initial review. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Author information



Corresponding author

Correspondence to Michael Bauer.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Bauer, M., Glenn, T., Monteith, S. et al. Ethical perspectives on recommending digital technology for patients with mental illness. Int J Bipolar Disord 5, 6 (2017).


Keywords

  • Mental illness
  • Digital healthcare
  • Ethics
  • Digital economy
  • Privacy