https://pixabay.com/users/colin00b-346653/
“Company is working with UCLA, Biogen to see if sensitive data like facial expressions, typing metrics could signal mental-health concerns.”
My two cents: I do not need a machine to tell me I am depressed.
However, this could be a great diagnostic tool for some therapists.
My other concern is Apple holding all those biometrics about my mental health, and what that means for my privacy.
I guess they could read this blog and find out way more about me than depression.
What a world we live in.
Posted by rudid96 on September 21, 2021 at 9:24 pm
I’ve enough discomfort posting in public. The last thing I’d want is for companies to target my vulnerabilities. And for what, to tell me I’m depressed or triggered? I see a therapist. It’s true, I have difficulty, more often than I care to admit, speaking up for myself and being totally transparent. Even if a company could send an alert to my therapist, what would that do? Unless one is checked into a full-treatment facility, a therapist isn’t on call 24/7. Additionally, that kind of exposure would silence me.
Apple, if you’re listening, No Thank You!
Posted by Marty on September 21, 2021 at 9:30 pm
We are secretive, vulnerable
Posted by rudid96 on September 21, 2021 at 10:38 pm
Yes, we are excruciatingly vulnerable. My vulnerabilities were exploited for much of my life, leaving in their wake what we have here today: a woman at this late stage of life who is required to work hard to elevate herself from C-PTSD. Secrecy is my safety.
My wins are fragile. The days that I’m simply treading water are tenuous.
Intentionally attempting to channel an adult, untraumatized, self is exhausting. I’ve faced enough humiliation. It’s no wonder traumatized people are secretive.
Posted by 7oakley on September 30, 2021 at 9:08 pm
Pretty scary that people will be depending on a machine to assess their mental health. And guess what (sarcasm here): there is a pill for that. Just call this number :)