
Invisible Exhaust: What EdTech Really Knows About Your Child—and Who Profits
On any given school day, a student may log into three to ten learning platforms before lunch. By graduation, a web of third-party software has amassed thousands of data points: grades, behavioral indicators, device usage, biometrics, and writing samples. This information is valuable. To corporations, it is the raw input for predictive engines and "personalized learning" products. To school districts, it promises efficiency. To families and students, however, the data is invisible. And most significantly, it was never meaningfully consented to.
EdTech startups, particularly those in the K-12 sector, are overdue for a reckoning with what "consent" should mean legally, ethically, and educationally. The question is not simply whether their software complies with FERPA or COPPA. It is whether they perpetuate power imbalances that turn students into data sets instead of learners.
The Myth of Consent in Schooling
When families electronically sign permission slips or click "agree" during online enrollment, they rarely understand how much information is being collected. Nor are they offered meaningful opt-out options that do not require sacrificing access to learning.
Under FERPA (the Family Educational Rights and Privacy Act) and COPPA (the Children's Online Privacy Protection Act), student data privacy is framed predominantly as access control: who is permitted to view the records. The deeper problem, however, is not transparency but agency.
Most EdTech tools treat consent as a one-time, end-user checkbox. In practice, that means students are enrolled into vast digital systems where their writing, attendance, behavior, and attention are tracked, analyzed, and even monetized, without the terms of consent ever being revisited as the uses of the data expand.
Platform Power and the Algorithmic Imagination
Contemporary learning systems increasingly rely on predictive analytics to flag students as "at risk," score the quality of their writing, or steer course enrollment. Most of these models are black boxes, and their accuracy is often skewed by factors correlated with race, disability, or language proficiency.
When students are graded or ranked by a machine, they are given few tools to dispute the result or even understand how it was reached. This defies education's foundational ideals: growth, autonomy, and second chances.
Algorithmic systems reflect the imagination of their makers: their notions of what success is, what risk is, and whose achievements count. Left unquestioned, these assumptions risk embedding systemic bias into the very workings of school.
A Conflict of Incentives
Imagine a mid-sized school district partnering with an EdTech vendor to offer "personalized learning" software. On paper, the product complies with privacy law. But buried deep in the terms of service is language that allows third-party data sharing for "product improvement."
A local journalist discovers that anonymized student data was licensed to an analytics firm building corporate hiring tools. Parents are incensed. The district insists the data was anonymized, but the public's trust is broken.
What this illustration reveals is that compliance does not equal ethical alignment. Families increasingly demand transparency, choice, and control, not fine print.
What Ethical EdTech Might Look Like
To truly rethink consent, EdTech platforms must do better than the legal minimum. They must adopt a pedagogy of consent: a design ethos that treats data as something students own, not something extracted from them.
Some pragmatic changes might include:
Transparent, plain-language dashboards showing what is tracked and how it is used
Recurring "consent intervals" built into the user experience at key moments
Opt-in, rather than opt-out, defaults for experimental features
Publicly available impact assessments and audits, particularly for AI-based tools
Co-design workshops that gather student and parent feedback before new features launch
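To make the "consent intervals" idea above concrete, here is a minimal sketch of how a platform might model it: each consent grant is scoped to one kind of data use and carries an expiry, so any new use, or any lapsed grant, forces a fresh ask rather than relying on a one-time checkbox. This is a hypothetical illustration, not any vendor's actual API; all names (`ConsentRecord`, `may_use`, the scope labels) are invented for this sketch.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical model: consent is scoped and time-limited, so new data
# uses and expired grants always require re-asking the family.

@dataclass
class ConsentRecord:
    scope: str                # e.g. "attendance", "writing_samples"
    granted_at: datetime
    valid_for: timedelta      # the "consent interval"

    def covers(self, scope: str, now: datetime) -> bool:
        # A grant covers only its own scope, and only until it expires.
        return self.scope == scope and now < self.granted_at + self.valid_for

def may_use(records: list[ConsentRecord], scope: str, now: datetime) -> bool:
    """True only if an unexpired grant exists for exactly this use."""
    return any(r.covers(scope, now) for r in records)

# A grant for attendance data does not cover writing samples, and
# every grant lapses after its interval, prompting renewed consent.
start = datetime(2024, 9, 1)
records = [ConsentRecord("attendance", start, timedelta(days=180))]
print(may_use(records, "attendance", start + timedelta(days=30)))    # True
print(may_use(records, "writing_samples", start))                    # False
print(may_use(records, "attendance", start + timedelta(days=200)))   # False
```

The design choice worth noting is the default: when no matching, unexpired record exists, the answer is "no," which mirrors the opt-in posture the list above recommends.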
This is not mere compliance; it is trust. Platforms that create open, values-based experiences will not only safeguard student dignity but also future-proof themselves against a fast-changing regulatory environment.
Conclusion: Consent is a Relationship
Relationships are everything in education. Teachers earn students' trust over time. Schools win over families through conversation and care. EdTech must do the same.
Consent must not be a box checked at the start of the semester. It must be an evolving conversation that grows alongside the tools. And if a platform wants to be seen as a partner rather than a vendor, it must invest in that relationship.
A generation of students is being shaped by technologies they did not choose and cannot fully understand. It is time for EdTech to broaden its imagination and reconceive what consent should mean in the classroom.