09-20-2024 09:52 AM
I noticed this in your latest announcement about updates to the TOS/privacy policy:
AI features. Like so many of you, we’re excited about AI, and we’re identifying ways it can enhance your product experience. We explain how we use data in our AI features.
(Sidenote: this is just terrible messaging, not everyone is “excited about AI,” and it’s basically a triggering word for a lot of people, who will write off your entire product because they have come to associate “AI” with “en**bleep**tification.”)
The only explanation of using data in AI features I could find was in the privacy policy (identical language appears in Legal Bases):
This is…not a lot of explanation. At a minimum I’d like to know:
More specifically, I’d like to be able to opt in or out of AI-assisted features (i.e., prohibit the use of my data in training a model). For example, I might want to forgo appearing on leaderboards if it means my data aren’t used to train anomaly detection.
10-02-2024 03:31 AM
Hello all,
Thank you for your patience while we prepared a reply to this request. We are happy to share more about how you can opt out of the use of your data for AI features.
Athlete Intelligence, a beta feature that provides training analysis and insights, is currently our only generative AI-powered feature. Athlete Intelligence is currently only available to a limited set of private beta testers, but we hope to open it more widely to athletes with a Strava subscription in the near future. Users can opt out at any time (by clicking the “Leave beta” button within the feedback module). Athlete Intelligence insights and summaries are only visible to the owner of the Strava account and are not shared more widely or used to train public AI models.
Strava also leverages non-LLM AI and machine learning to improve certain backend features, such as the detection of anomalies in leaderboard entries and route recommendations. We do not use large language models for these features. We develop and train models internally and do not use or share the activity data with third parties.
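As a rough illustration of what non-LLM anomaly detection over leaderboard efforts can look like, here is a generic sketch using an unsupervised outlier model. The features, values, and model choice are made up for illustration and are not Strava’s actual model or data.

```python
# Minimal sketch (not Strava's implementation): flagging implausible
# leaderboard efforts with an unsupervised outlier model.
# Feature names, values, and the contamination rate are assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-effort features:
# [average speed (km/h), max speed (km/h), average heart rate (bpm)]
efforts = np.array([
    [32.1, 48.0, 158],   # plausible road-bike segment effort
    [29.8, 45.5, 162],
    [31.0, 47.2, 149],
    [30.5, 46.1, 155],
    [55.0, 80.3,  95],   # very fast with low heart rate (possible motor/vehicle)
])

model = IsolationForest(contamination=0.2, random_state=0)
model.fit(efforts)

# predict() returns -1 for outliers (candidates for review) and 1 for inliers.
for features, label in zip(efforts, model.predict(efforts)):
    status = "flag for review" if label == -1 else "looks normal"
    print(f"{features} -> {status}")
```

In a setup like this, a flagged effort would typically be queued for human or rule-based review rather than removed automatically.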
If you do not wish to contribute to route recommendations, you can disable the Aggregated Data Usage privacy control or make your activities visible to “Followers” or “Only You.” At this time, only public activities that use the “Everyone” privacy setting are included in leaderboard anomaly detection training.
We appreciate your feedback and hope this information helps address your concerns.
10-03-2024 11:00 AM
I have a problem with being required to accept all the changes before I can access my own data.
How do I download my data so I can then delete the app?
I also need a refund of the rest of my billing term.
Thanks, Richard
10-02-2024 07:11 AM - edited 10-02-2024 07:13 AM
Regardless of the explanation, anyone who wants to use Strava has to agree to the terms and conditions. Since the terms and conditions do not distinguish between users based on who opted out of which features, it would be honest and fair to offer two sets of T&C: one for those who want to "opt in" to AI features in one way or another, and one for those who do not. At this point, the only way to opt out is to not agree to the T&C, which renders Strava a useless service and app on my phone. So why should anyone who does not want to be thrown into AI h-e-l-l keep using Strava?
09-23-2024 08:29 AM - edited 09-23-2024 08:30 AM
This will most likely go viral on LinkedIn and other social sites once legal and tech industry practitioners discover it, similar to how other mass data-ingesting platforms have suffered in recent months.
It would be great if Strava got ahead of it and addressed it with a level and granularity of transparency that fosters customer trust. The AI feature release could be pushed back on the roadmap in order to develop the compliance mechanisms needed.
09-23-2024 06:46 AM
Opting users into AI raises several privacy and ethical concerns. Here are a few specific examples:
I realize that many of these may seem like a stretch, but given the vagueness of the privacy policy update, I have to assume the worst-case scenario: that Strava will not only use user data to train AI models but will also share or monetize this data in the future. Worse yet is if they don't develop the AI in-house and instead use a third-party company, leaving them with much less control over how that data will be used or shared in the future. And that is a concern because the privacy policy specifically says "Categories of third parties that we may share your information with are listed here," which leads to a ":todo", meaning this is an area they plan to update.
In short, Strava can't claim to care about privacy while auto-opting users in to this. I eagerly await a response from Strava.
09-21-2024 03:12 AM
I believe that training the AI with flagged and unflagged activities, so it learns when an activity has the wrong activity type or had motor assistance, would be highly beneficial for all the users who care about the integrity of leaderboards. If only activities and data streams visible to everyone are used (I don't know that), then no privacy has been breached. But I recognize that using AI, in a psychological sense, goes further than the processing of data that already happens, and therefore it would be wise to give users options for that. I'm only concerned that an opt-in makes the whole AI thing useless, because we know from the fly-by feature that so many people don't even know there is an opt-in for that. Maybe it could be promoted a bit more than just some checkbox in the settings.
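For anyone curious what that kind of flagged-vs-unflagged training could look like in practice, here is a generic sketch of a supervised binary classifier. The features, labels, and model choice are invented for illustration and are not based on any knowledge of Strava's internal systems.

```python
# Minimal sketch (an assumption, not Strava's implementation): training a
# classifier on activities labeled as flagged or unflagged, as the post suggests.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-activity features:
# [average speed (km/h), average heart rate (bpm), elevation gain (m)]
X = np.array([
    [30.0, 155, 600],   # unflagged ride
    [28.5, 160, 450],   # unflagged ride
    [52.0,  98, 300],   # flagged: speed/heart-rate profile suggests motor assistance
    [31.2, 150, 700],   # unflagged ride
    [60.5, 105, 200],   # flagged
])
y = np.array([0, 0, 1, 0, 1])  # 1 = flagged, 0 = unflagged

clf = LogisticRegression().fit(X, y)

# Score a new public activity; a high probability could queue it for review
# rather than removing it from leaderboards automatically.
new_activity = np.array([[58.0, 100, 250]])
print("probability flagged:", clf.predict_proba(new_activity)[0, 1])
```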
09-21-2024 01:46 AM
Does anybody know where to report this to the EU Commission?
09-21-2024 02:21 AM
I think this would go through the Digital Services Coordinator in your country. https://digital-strategy.ec.europa.eu/en/policies/dsa-dscs