CCHR Warns of College Students’ Rights Violations Using Mental Health Apps

College students have been the target of marketing for mental health apps, often without realizing that their personal information can be hawked to social media outlets and used for data mining. Some apps monitor a user’s social media messages to arbitrarily “predict” behavior in need of treatment, while sensors are now embedded in an antipsychotic drug so that it can be electronically tracked to ensure the patient takes it, a practice described as new age surveillance. The mental health industry watchdog Citizens Commission on Human Rights International (CCHR) is concerned that college and other students may be unaware that educational problems are a target for unscrupulous “mental health” marketing.

Former nurses for one mental health app company told Bloomberg they “feared that they were fueling a new addiction crisis” by making stimulants and amphetamines so easy to get.[1]

There are approximately 10,000 to 20,000 mental health apps available, according to the American Psychological Association.[2] They offer everything from assessment to treatment, including prescriptions for powerful and potentially debilitating drugs. The global mental health apps market is lucrative, expected to grow from $4.7 billion in 2021 to $5.5 billion in 2022.[3] These apps can help further fuel the nearly $26 billion a year in psychotropic drug sales in the U.S.[4]

Patients who receive health care services are protected by the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which prohibits disclosure of sensitive patient health information without the patient’s consent.[5] That protection isn’t available with mental health apps.[6] Roger Severino, a former director of the Department of Health and Human Services’ Office for Civil Rights, the federal agency charged with enforcing health privacy rules, pointed out: “Tell a psychologist, ‘I’m depressed,’ and HIPAA restricts how that information can be used. But type those same words into an app that has no connection to a covered entity [doctor, insurance company, etc. which is liable under HIPAA], and HIPAA doesn’t protect you.”[7]

The Washington Post reports, “When data from a mental health app is shared or sold to other parties, a wealth of information can be used for purposes beyond the health needs of students. Insurers can use it to calculate premiums, employers can use it to assess risk, advertisers can use it to tailor ads to consumer preferences or conditions, and all can exploit students’ weaknesses.”[8]

Mental health apps can function as virtual pharmacies through which psychiatric drugs are prescribed, including potentially addictive stimulants and antidepressants with suicide-related side effects that are especially dangerous for college-aged students. A stringent warning states that antidepressants taken by those aged up to 24 put them at risk of suicidal behavior.

A Bloomberg investigation of one major mental health app company found evidence of harmful “overtreatment.” For example, a user alleged she was prescribed three antidepressants, an anticonvulsant, and an antipsychotic over the course of three months. One of these drugs was prescribed after just an 18-minute video visit.[9]

Another consumer told CBS News that in her first roughly 15-minute appointment with a mental health app prescriber she was prescribed three drugs and in a second, equally brief appointment, was given two more. She worsened and alerted her prescriber about nightmares she was having about hanging herself but said she was brushed off. The next day, a family member found her hanging from a dog leash in her bathroom. She does not remember anything from the incident and thought she had been “dreaming.”[10]

The standard and validity of psychiatric diagnoses were already shockingly poor. Psychologist Lisa Cosgrove and co-researchers published a report on mental health app surveillance tools, pointing out, “The lack of biomarkers, or objective measurements, to determine mental disorders has plagued psychiatry and resulted in concerns about the validity of psychiatric disorders.” Consequently, psychiatrists are turning their attention to digital phenotyping (tracking a set of observable characteristics or traits of an organism), promoted as an objective way to measure, and supposedly predict, traits, behavior, and mood. Digital phenotyping technology uses sensors that can track an individual’s behavior, location, and speech patterns (e.g., intonation).[11]

It goes further, using sensors to enforce psychiatric drug compliance. One antipsychotic (aripiprazole) has an embedded sensor that, once ingested, communicates with a wearable patch, which transmits data to a smartphone app to report that the patient has taken the drug.[12] Cosgrove and colleagues concluded that “the advent of digital psychotropic drugs marks a new age in surveillance and poses risks to privacy and human rights, possibly in ways yet unimagined.” These digital technologies “promote practices that violate the right to freedom, including freedom from coercive or degrading treatment.”[13]

Jan Eastgate, president of CCHR International, says CCHR’s Mental Health Declaration of Human Rights includes many of the rights that college students and other mental health treatment consumers should be aware of. “Should they be harmed, they can confidentially report abuse to CCHR and we can assist them to take action,” she said. The group has spent more than 50 years documenting and bringing to account criminal and human rights abuses committed in the guise of mental health care.


[1] “How mental health apps can accelerate the psychiatric prescribing cascade,” Lown Institute, 18 Mar. 2022, lowninstitute.org/how-mental-health-apps-can-accelerate-the-psychiatric-prescribing-cascade/

[2] www.popsci.com/science/mental-health-apps-safety/

[3] www.thebusinessresearchcompany.com/report/mental-health-apps-global-market-report

[4] www.cchrint.org/2020/05/19/the-brave-new-world-of-artificial-intelligence-in-mental-health/

[5] www.cdc.gov/phlp/publications/topic/hipaa.html

[6] Op. cit., Lown Institute

[7] Thomas Germain, “Mental Health Apps Aren’t All As Private As You May Think,” Consumer Reports, 2 Mar. 2021, www.consumerreports.org/health-privacy/mental-health-apps-and-user-privacy-a7415198244/

[8] Deanna Paul, “Colleges want freshmen to use mental health apps. But are they risking students’ privacy?” Washington Post, 2 Jan. 2020, www.washingtonpost.com/technology/2019/12/27/colleges-want-freshmen-use-mental-health-apps-are-they-risking-students-privacy/

[9] Op. cit., Lown Institute

[10] Anna Werner, “Expert alarmed by mental health app Cerebral’s speedy sessions and prescriber qualifications,” CBS News, 7 Sept. 2022, www.cbsnews.com/news/cerebral-app-mental-health/

[11] Lisa Cosgrove Ph.D., et al., “Digital Phenotyping and Digital Psychotropic Drugs: Mental Health Surveillance Tools That Threaten Human Rights,” Health Human Rights, Dec. 2020, www.ncbi.nlm.nih.gov/pmc/articles/PMC7762923/

[12] journalbipolardisorders.springeropen.com/articles/10.1186/s40345-019-0164-x

[13] Op. cit., Health Human Rights

Citizens Commission on Human Rights International
media@cchr.org
+1-323-467-4242
6616 Sunset Boulevard

United States
