{"version":"https://jsonfeed.org/version/1.1","title":"The Cirdia Pulse | Biometric insights, community voices","home_page_url":"https://cirdia.com","feed_url":"https://blog-cirdia-com.pages.dev/json/","description":"","icon":"https://media-cdn.cirdia.com/blog-cirdia-com/production/images/channel-da9a1999f4ce1c8e2690c8189769a4e0.jpg","favicon":"https://blog-cirdia-com.pages.dev/assets/default/favicon.png","language":"en-us","items":[{"id":"tccNkHqJ-Nh","title":"Your Data, Your Body: Why Cirdia's Privacy Approach Matters","attachments":[{"url":"https://media-cdn.cirdia.com/blog-cirdia-com/production/media/image-09e1b2fd669e3880f66030a58fcd370d.jpg","mime_type":"image/jpeg","size_in_byte":1659557}],"url":"https://blog.cirdia.com/i/tccNkHqJ-Nh/","content_html":"<p>Every transformative technology begins with a moment of clarity—that instant when you realize the status quo isn't just inconvenient, but fundamentally misaligned with human dignity. For me, that moment came when I systematically compared how different wearable companies handle our most intimate body data.</p><h2><strong>The Revelation: When I Compared the Fine Print</strong></h2><p>Unlike most people, I've always been drawn to legal agreements. There's something fascinating about the precision of language, the careful boundaries being established, the dance between protection and permission. I'm that odd person who actually enjoys reading terms of service—each one a window into corporate values and priorities.</p><p>So I embarked on a comprehensive audit—downloading, printing, and meticulously comparing the privacy policies and terms of service from every major player in the wearable space. I created spreadsheets tracking key provisions, highlighted concerning clauses, and mapped how permissions flowed across their ecosystems.</p><p>What I discovered was deeply troubling. 
These companies weren't just collecting data—they were claiming sweeping rights to use, change, publish, and share our most intimate biometric information however they wanted. Most disturbing was the almost complete silence on AI training, with few making clear promises about not feeding your heart rate, sleep patterns, and stress levels into larger systems.</p><h2><strong>The Crucial Distinction: Biometric Data vs. Personal Information</strong></h2><p>What became abundantly clear through my research was a fundamental problem in how companies handle our most sensitive information. Most wearable companies make no meaningful distinction between your biometric data—the intimate measurements of your body's functions—and other personal information like your email address or birthday.</p><p>This false equivalence creates the foundation for deeply problematic data practices. When your heart rate variability is treated with the same privacy protections as your zip code, something has gone terribly wrong.</p><h2><strong>The Cirdia Difference: Privacy as Our Foundation</strong></h2><p>At Cirdia, we've taken a fundamentally different approach. We believe your biometric data deserves special protection—not just in policy documents, but in how our entire system is designed. As a Public Benefit Corporation chartered in Colorado, we have a legal obligation that goes beyond profit maximization. Our corporate charter explicitly commits us to:</p><p>\"Empowering User Agency: Building systems that respect users as competent decision-makers by providing transparent opt-in processes, collaborative research opportunities, and meaningful control over their wellness journey and data sharing choices.\"</p><p>This isn't just aspiration—it's a legally binding commitment that shapes every aspect of our product and business.</p><p>We make a clear legal and technical distinction between your biometric data and your account information. 
Your body's measurements receive fundamentally different privacy protections than your email address or other account details. This isn't just semantic—it shapes how our entire system is built, with local-first processing that keeps your raw biometric data on your device, not our servers.</p><p>While some companies like Apple and Oura have taken steps in the right direction, Cirdia was created from the ground up with privacy and user agency as our core design principles—not features added later or compromises made within a different business model.</p><h2><strong>From Personal Revelation to Industry Contrast</strong></h2><p>Let me share what I discovered when I actually took the time to decode what other companies are claiming rights to do:</p><h3><strong>Fitbit/Google: Your Body as a Data Mine</strong></h3><p>Fitbit's terms state explicitly that when you share content through their services, you grant them \"the right to use, copy, modify, publicly display, publicly perform, reproduce, translate, create derivative works from, and distribute your content\". That heartfelt journal entry about your health struggles? That's now Google's content to use as they see fit.</p><p>Even more concerning, when you use Fitbit with a Google account, your data is handled according to Google's privacy practices. This means your biometric data becomes part of Google's vast data ecosystem—the same one powering their advertising empire. While they have said publicly that they won’t use your biometrics for advertising, they have not made that promise legally binding in either their terms of service or their privacy policy, which means they could reverse course at any time without consequence.</p><h3><strong>WHOOP: The Surveillance Business Model</strong></h3><p>Unlike what the sleek marketing suggests, WHOOP's privacy policy reveals a business model built on surveillance. 
They collect data through cookies and other automated technologies, including tracking your interactions over time across the web and other services. They and their advertising partners use this information to serve you targeted ads.</p><p>Your sleep patterns, recovery metrics, and heart rate variability become inputs for advertising algorithms. WHOOP shares your data with \"service providers, vendors who advertise our Services or other WHOOP products, security and fraud prevention consultants, analytics providers, and staff augmentation and contract personnel\". That's an extraordinarily broad network of third parties gaining access to your most intimate biometric data.</p><p>Most concerning is that WHOOP makes no technical or legal distinction between your biometric data and regular personal information—it's all treated as a resource to be leveraged for business purposes.</p><h3><strong>Samsung: No Control, No Choice</strong></h3><p>Samsung Health's terms of service are particularly aggressive in diminishing user control. They \"EXPRESSLY DISCLAIM ANY AND ALL LIABILITY\" for how your health information is used, while simultaneously reserving the right to remove or disable access to the service, its content, or your content at any time and without notice.</p><p>In other words, they can delete your health history at any time with no warning, but accept no responsibility for how that same data is used or shared. This imbalance of power is staggering.</p><h3><strong>Xiaomi: Your Data, Their Ecosystem</strong></h3><p>Xiaomi's approach to privacy is particularly alarming for global users. They state that they may use and combine the information they collect about you with data from other services and features you use, your devices, and other sources. 
This broad combination of data across services creates comprehensive user profiles.</p><p>Making no distinction between biometric data and other personal information, Xiaomi freely shares your information with affiliates and third parties for marketing purposes.</p><h2><strong>The Real-World Impact of These Policies</strong></h2><p>These terms and policies aren't just abstract legal frameworks—they have concrete impacts on people's lives:</p><ol><li><strong>Health Insurance Discrimination</strong>: When biometric data is sold or shared with data brokers, it can ultimately influence insurance algorithms, potentially resulting in higher premiums or denied coverage based on activity patterns.</li><li><strong>Sensitive Life Event Exposure</strong>: We've all heard stories of targeted ads revealing pregnancies before women were ready to share the news. These aren't urban myths—they're the predictable outcome of health data being fed into advertising systems.</li><li><strong>Location Tracking Without Consent</strong>: Many fitness apps track your precise location even when not needed for functionality. This creates detailed maps of your movements, habits, and patterns.</li><li><strong>Perpetual Data Retention</strong>: Most fitness platforms retain your data indefinitely, even after you've deleted your account. This creates permanent digital shadows of our physical existence that we can never fully reclaim.</li></ol><h2><strong>Our Design Principles: Better by Design, Not Just Better Terms</strong></h2><h3><strong>Local-First Architecture: Privacy by Design</strong></h3><p>Unlike traditional wearables that require cloud processing, Cirdia uses a local-first approach. Your data is processed primarily on your device or your phone. By design, Cirdia does not store your raw biometric data on centralized servers.</p><p>This architectural choice isn't just more private—it's more resilient. 
You don't lose access to your health insights when servers go down or companies change policies.</p><h3><strong>Transparent Algorithms</strong></h3><p>All algorithms used in our App are open source or auditable. This transparency extends to how we communicate insights about your health—no black box recommendations or unexplainable guidance.</p><p>When it comes to AI and machine learning, we've reimagined the approach entirely. If we ever incorporate model training using your data, we would only do so through a distributed, local-first framework that keeps your raw biometric data on your device. This means the insights and patterns can improve our collective understanding without your biometric information ever leaving your personal sphere of control.</p><p>This isn't a technical limitation—it's a deliberate architectural choice that aligns with our core values. We believe distributed AI approaches that respect data boundaries are not just more private but ultimately more innovative, drawing insights from diverse experiences while honoring individual autonomy.</p><h3><strong>Data Ownership in Practice</strong></h3><p>You own your data. The App offers tools to visualize, export, or delete your data at any time. This isn't just rhetoric—it's built into how our technology works.</p><p>When you choose to share data with Cirdia for research or product improvement, we follow the principle of data minimization—collecting only what's necessary, for the specific purpose you've consented to, and only for the duration required.</p><h2><strong>Building a Movement, Not Just a Product</strong></h2><p>The wearable device industry has normalized deeply problematic data practices by burying them in legal documents and making them seem inevitable. At Cirdia, we're not just building another device with marginally better terms. 
We're reimagining what the relationship between technology companies and users should look like—one founded on respect, transparency, and genuine partnership.</p><p>When our market research showed that women wanted \"a fitness tracker that doesn't sell you out\" and to \"stay in their bodies, not on their phones,\" they weren't just expressing product preferences. They were articulating a vision for a fundamentally different relationship with technology—one that enhances their embodied experience rather than extracting value from it.</p><h2><strong>Join Us in Reimagining Wearable Privacy</strong></h2><p>Your intimate body data—your heartbeats, sleep patterns, stress levels, and activity—deserve better than becoming inputs for advertising algorithms or assets on corporate balance sheets.</p><p>At Cirdia, we're committed to proving that better approaches aren't just possible—they're essential for the future of ethical technology. Your data, like your body, should always remain yours to control.</p><p>Because wellness isn't about optimization or gamification. It's about presence, autonomy, and living fully in your body. And technology should serve that vision, not undermine it.</p><p>Join our community: <a href=\"https://cirdia.com/\" rel=\"noopener noreferrer\" target=\"_blank\">https://cirdia.com</a></p><p><em>Mary Camacho is CEO and Co-founder of Cirdia, a Public Benefit Corporation reimagining ethical wellness technology. This post is part of our ongoing commitment to transparency about how we approach privacy and data governance.</em></p>","content_text":"Every transformative technology begins with a moment of clarity—that instant\nwhen you realize the status quo isn't just inconvenient, but fundamentally\nmisaligned with human dignity. 
For me, that moment came when I systematically\ncompared how different wearable companies handle our most intimate body data.\n\n\nTHE REVELATION: WHEN I COMPARED THE FINE PRINT\n\nUnlike most people, I've always been drawn to legal agreements. There's\nsomething fascinating about the precision of language, the careful boundaries\nbeing established, the dance between protection and permission. I'm that odd\nperson who actually enjoys reading terms of service—each one a window into\ncorporate values and priorities.\n\nSo I embarked on a comprehensive audit—downloading, printing, and meticulously\ncomparing the privacy policies and terms of service from every major player in\nthe wearable space. I created spreadsheets tracking key provisions, highlighted\nconcerning clauses, and mapped how permissions flowed across their ecosystems.\n\nWhat I discovered was deeply troubling. These companies weren't just collecting\ndata—they were claiming sweeping rights to use, change, publish, and share our\nmost intimate biometric information however they wanted. Most disturbing was the\nalmost complete silence on AI training, with few making clear promises about not\nfeeding your heart rate, sleep patterns, and stress levels into larger systems.\n\n\nTHE CRUCIAL DISTINCTION: BIOMETRIC DATA VS. PERSONAL INFORMATION\n\nWhat became abundantly clear through my research was a fundamental problem in\nhow companies handle our most sensitive information. Most wearable companies\nmake no meaningful distinction between your biometric data—the intimate\nmeasurements of your body's functions—and other personal information like your\nemail address or birthday.\n\nThis false equivalence creates the foundation for deeply problematic data\npractices. When your heart rate variability is treated with the same privacy\nprotections as your zip code, something has gone terribly wrong.\n\n\nTHE CIRDIA DIFFERENCE: PRIVACY AS OUR FOUNDATION\n\nAt Cirdia, we've taken a fundamentally different approach. 
We believe your\nbiometric data deserves special protection—not just in policy documents, but in\nhow our entire system is designed. As a Public Benefit Corporation chartered in\nColorado, we have a legal obligation that goes beyond profit maximization. Our\ncorporate charter explicitly commits us to:\n\n\"Empowering User Agency: Building systems that respect users as competent\ndecision-makers by providing transparent opt-in processes, collaborative\nresearch opportunities, and meaningful control over their wellness journey and\ndata sharing choices.\"\n\nThis isn't just aspiration—it's a legally binding commitment that shapes every\naspect of our product and business.\n\nWe make a clear legal and technical distinction between your biometric data and\nyour account information. Your body's measurements receive fundamentally\ndifferent privacy protections than your email address or other account details.\nThis isn't just semantic—it shapes how our entire system is built, with\nlocal-first processing that keeps your raw biometric data on your device, not\nour servers.\n\nWhile some companies like Apple and Oura have taken steps in the right\ndirection, Cirdia was created from the ground up with privacy and user agency as\nour core design principles—not features added later or compromises made within a\ndifferent business model.\n\n\nFROM PERSONAL REVELATION TO INDUSTRY CONTRAST\n\nLet me share what I discovered when I actually took the time to decode what\nother companies are claiming rights to do:\n\n\nFITBIT/GOOGLE: YOUR BODY AS A DATA MINE\n\nFitbit's terms state explicitly that when you share content through their\nservices, you grant them \"the right to use, copy, modify, publicly display,\npublicly perform, reproduce, translate, create derivative works from, and\ndistribute your content\". That heartfelt journal entry about your health\nstruggles? 
That's now Google's content to use as they see fit.\n\nEven more concerning, when you use Fitbit with a Google account, your data is\nhandled according to Google's privacy practices. This means your biometric data\nbecomes part of Google's vast data ecosystem—the same one powering their\nadvertising empire. While they have said publicly that they won’t use your\nbiometrics for advertising, they have not made that promise legally binding in\neither their terms of service or their privacy policy, which means they could\nreverse course at any time without consequence.\n\n\nWHOOP: THE SURVEILLANCE BUSINESS MODEL\n\nUnlike what the sleek marketing suggests, WHOOP's privacy policy reveals a\nbusiness model built on surveillance. They collect data through cookies and\nother automated technologies, including tracking your interactions over time\nacross the web and other services. They and their advertising partners use this\ninformation to serve you targeted ads.\n\nYour sleep patterns, recovery metrics, and heart rate variability become inputs\nfor advertising algorithms. WHOOP shares your data with \"service providers,\nvendors who advertise our Services or other WHOOP products, security and fraud\nprevention consultants, analytics providers, and staff augmentation and contract\npersonnel\". That's an extraordinarily broad network of third parties gaining\naccess to your most intimate biometric data.\n\nMost concerning is that WHOOP makes no technical or legal distinction between\nyour biometric data and regular personal information—it's all treated as a\nresource to be leveraged for business purposes.\n\n\nSAMSUNG: NO CONTROL, NO CHOICE\n\nSamsung Health's terms of service are particularly aggressive in diminishing\nuser control. 
They \"EXPRESSLY DISCLAIM ANY AND ALL LIABILITY\" for how your\nhealth information is used, while simultaneously reserving the right to remove\nor disable access to the service, its content, or your content at any time and\nwithout notice.\n\nIn other words, they can delete your health history at any time with no warning,\nbut accept no responsibility for how that same data is used or shared. This\nimbalance of power is staggering.\n\n\nXIAOMI: YOUR DATA, THEIR ECOSYSTEM\n\nXiaomi's approach to privacy is particularly alarming for global users. They\nstate that they may use and combine the information they collect about you with\ndata from other services and features you use, your devices, and other sources.\nThis broad combination of data across services creates comprehensive user\nprofiles.\n\nMaking no distinction between biometric data and other personal information,\nXiaomi freely shares your information with affiliates and third parties for\nmarketing purposes.\n\n\nTHE REAL-WORLD IMPACT OF THESE POLICIES\n\nThese terms and policies aren't just abstract legal frameworks—they have\nconcrete impacts on people's lives:\n\n 1. Health Insurance Discrimination: When biometric data is sold or shared with\n    data brokers, it can ultimately influence insurance algorithms, potentially\n    resulting in higher premiums or denied coverage based on activity patterns.\n 2. Sensitive Life Event Exposure: We've all heard stories of targeted ads\n    revealing pregnancies before women were ready to share the news. These\n    aren't urban myths—they're the predictable outcome of health data being fed\n    into advertising systems.\n 3. Location Tracking Without Consent: Many fitness apps track your precise\n    location even when not needed for functionality. This creates detailed maps\n    of your movements, habits, and patterns.\n 4. 
Perpetual Data Retention: Most fitness platforms retain your data\n    indefinitely, even after you've deleted your account. This creates permanent\n    digital shadows of our physical existence that we can never fully reclaim.\n\n\nOUR DESIGN PRINCIPLES: BETTER BY DESIGN, NOT JUST BETTER TERMS\n\n\nLOCAL-FIRST ARCHITECTURE: PRIVACY BY DESIGN\n\nUnlike traditional wearables that require cloud processing, Cirdia uses a\nlocal-first approach. Your data is processed primarily on your device or your\nphone. By design, Cirdia does not store your raw biometric data on centralized\nservers.\n\nThis architectural choice isn't just more private—it's more resilient. You don't\nlose access to your health insights when servers go down or companies change\npolicies.\n\n\nTRANSPARENT ALGORITHMS\n\nAll algorithms used in our App are open source or auditable. This transparency\nextends to how we communicate insights about your health—no black box\nrecommendations or unexplainable guidance.\n\nWhen it comes to AI and machine learning, we've reimagined the approach\nentirely. If we ever incorporate model training using your data, we would only\ndo so through a distributed, local-first framework that keeps your raw biometric\ndata on your device. This means the insights and patterns can improve our\ncollective understanding without your biometric information ever leaving your\npersonal sphere of control.\n\nThis isn't a technical limitation—it's a deliberate architectural choice that\naligns with our core values. We believe distributed AI approaches that respect\ndata boundaries are not just more private but ultimately more innovative,\ndrawing insights from diverse experiences while honoring individual autonomy.\n\n\nDATA OWNERSHIP IN PRACTICE\n\nYou own your data. The App offers tools to visualize, export, or delete your\ndata at any time. 
This isn't just rhetoric—it's built into how our technology\nworks.\n\nWhen you choose to share data with Cirdia for research or product improvement,\nwe follow the principle of data minimization—collecting only what's necessary,\nfor the specific purpose you've consented to, and only for the duration\nrequired.\n\n\nBUILDING A MOVEMENT, NOT JUST A PRODUCT\n\nThe wearable device industry has normalized deeply problematic data practices by\nburying them in legal documents and making them seem inevitable. At Cirdia,\nwe're not just building another device with marginally better terms. We're\nreimagining what the relationship between technology companies and users should\nlook like—one founded on respect, transparency, and genuine partnership.\n\nWhen our market research showed that women wanted \"a fitness tracker that\ndoesn't sell you out\" and to \"stay in their bodies, not on their phones,\" they\nweren't just expressing product preferences. They were articulating a vision for\na fundamentally different relationship with technology—one that enhances their\nembodied experience rather than extracting value from it.\n\n\nJOIN US IN REIMAGINING WEARABLE PRIVACY\n\nYour intimate body data—your heartbeats, sleep patterns, stress levels, and\nactivity—deserve better than becoming inputs for advertising algorithms or\nassets on corporate balance sheets.\n\nAt Cirdia, we're committed to proving that better approaches aren't just\npossible—they're essential for the future of ethical technology. Your data, like\nyour body, should always remain yours to control.\n\nBecause wellness isn't about optimization or gamification. It's about presence,\nautonomy, and living fully in your body. And technology should serve that\nvision, not undermine it.\n\nJoin our community: https://cirdia.com\n\nMary Camacho is CEO and Co-founder of Cirdia, a Public Benefit Corporation\nreimagining ethical wellness technology. 
This post is part of our ongoing\ncommitment to transparency about how we approach privacy and data governance.","image":"https://media-cdn.cirdia.com/blog-cirdia-com/production/images/item-3f4e23a7ca784690fa8533fc0ffa991b.jpg","banner_image":"https://media-cdn.cirdia.com/blog-cirdia-com/production/media/image-09e1b2fd669e3880f66030a58fcd370d.jpg","date_published":"2025-06-05T23:25:00.000Z","_microfeed":{"is_audio":false,"is_document":false,"is_external_url":false,"is_video":false,"is_image":true,"web_url":"https://blog-cirdia-com.pages.dev/i/your-data-your-body-why-cirdias-privacy-approac-tccNkHqJ-Nh/","json_url":"https://blog-cirdia-com.pages.dev/i/tccNkHqJ-Nh/json/","rss_url":"https://blog-cirdia-com.pages.dev/i/tccNkHqJ-Nh/rss/","guid":"tccNkHqJ-Nh","status":"published","itunes:title":"Mary Camacho","itunes:episodeType":"full","date_published_short":"Thu Jun 05 2025","date_published_ms":1749165900000}}],"_microfeed":{"microfeed_version":"0.1.2","base_url":"https://blog-cirdia-com.pages.dev","categories":[{"name":"Health & Fitness"},{"name":"Technology"}],"subscribe_methods":[{"name":"RSS","type":"rss","url":"https://blog-cirdia-com.pages.dev/rss/","image":"https://blog-cirdia-com.pages.dev/assets/brands/subscribe/rss.png","enabled":true,"editable":false,"id":"dBXR_fcmwxX"},{"name":"JSON","type":"json","url":"https://blog-cirdia-com.pages.dev/json/","image":"https://blog-cirdia-com.pages.dev/assets/brands/subscribe/json.png","enabled":true,"editable":false,"id":"LS9clj_DZDP"}],"description_text":"","copyright":"©2025 Cirdia™","itunes:type":"episodic","items_sort_order":"newest_first"}}