The lawsuit between Scholly founder Chris Gray and student-loan giant Sallie Mae is not just another messy post‑acquisition story. It goes to the heart of a bigger question: when mission‑driven apps serving vulnerable users are acquired by large financial players, what happens to the data and the trust that built them?
If Gray’s allegations are even partly correct, this case could become a textbook example of how student data – including data on minors and racial background – is quietly turned into an asset class. In this piece, we’ll unpack what happened, why it matters far beyond the U.S., and what lessons founders, regulators and students should take from it.
The News in Brief
According to TechCrunch, Chris Gray, co‑founder of scholarship‑search startup Scholly, has sued its acquirer Sallie Mae in Delaware Superior Court and filed a whistleblower complaint with the U.S. Securities and Exchange Commission.
Gray sold Scholly to Sallie Mae in July 2023 and joined as a vice president. Around a year later, he alleges, Sallie Mae laid off the Scholly founding team and then fired him after he raised concerns internally about how student data from Scholly was being used.
The filings reviewed by TechCrunch claim that Sallie Mae routed Scholly into a non‑bank subsidiary, SLM Education Services, which operates Sallie.com, and that this entity sells user data – including age, gender, race, financial‑need indicators and other details – to advertisers, universities and data brokers. Gray argues that this structure allows Sallie Mae to sidestep stricter rules that apply to its regulated banking arm.
Sallie Mae has publicly denied the allegations, calling them baseless, and says it will vigorously defend itself. It has not answered detailed questions about its data practices.
Why This Matters
At first glance, this is a dispute between a founder and a financial institution. Look closer and it becomes a stress test of three fragile systems at once: edtech, fintech and data governance.
1. The trust problem in mission‑driven startups
Scholly built its brand on helping mostly young, often low‑income students navigate scholarships – a population with limited bargaining power and little awareness of how their data can be monetised. If a company like that is later folded into a data‑selling machine, the message to users is simple: your trust expires at exit.
Founders of social‑impact apps should pay attention. Promising “we’ll never sell your data” is meaningless if the company can be acquired by someone who will. Without strong contractual safeguards or regulatory constraints, an exit can flip a data‑ethics switch overnight.
2. Regulatory arbitrage as a product strategy
Gray alleges that Sallie Mae placed Scholly inside a non‑bank subsidiary so it could do things the regulated bank cannot – namely, sell detailed student profiles. If true, this is a classic case of regulatory arbitrage: use corporate structure, not product innovation, as your competitive edge.
This is not new in financial services, but the stakes are higher when the data involves minors, race, and financial distress signals. Even in the U.S., where privacy law is fragmented, this is the kind of grey zone that attracts the attention of the Consumer Financial Protection Bureau and state attorneys general.
3. The quiet normalisation of student surveillance
Edtech already has a troubling record: proctoring tools that monitor eye movement, learning platforms that track every click, universities buying data to predict which students will drop out. A data‑rich scholarship app plugs neatly into that ecosystem. Sold to advertisers and institutions, this data helps construct a behavioural dossier about teenagers before they have any real say in how it is used.
The direct winners here, if the allegations hold, are data brokers and marketers who gain highly granular access to Gen Z and Gen Alpha. The losers are students, who are profiled and targeted at the exact moment they are most financially vulnerable.
The Bigger Picture
The Scholly story fits into a broader pattern: as growth in consumer fintech slows, incumbents are looking for new revenue streams in data and media.
We’ve already seen:
- Banks rebranding as “platforms” and “media networks”. Sallie Mae’s Sallie.com and its “Backpack Media” network, described in company materials as a way for brands to reach young audiences, look strikingly similar to what card schemes and neobanks are doing with merchant‑funded offers and ad‑tech style targeting.
- Edtech shifting from subscription models to data monetisation. Many learning platforms started out charging small fees and later introduced advertising, lead‑generation for universities or job‑placement partners. When subscription revenue plateaus, data becomes the obvious lever.
- A history of aggressive practices around student loans. Navient, the former Sallie Mae spinoff, has already paid large settlements over what regulators called predatory behaviour. That backdrop matters: the incentives in this market have not historically rewarded restraint.
In this context, the alleged sale of Scholly data is less an outlier and more a continuation of a trend: financial and educational players turning their user bases into ad networks.
What’s different now is the sensitivity of the attributes involved. Age, race, financial hardship and education records sit very close to categories that, in many jurisdictions, are considered “special” or “sensitive” data. Combining them with geolocation and contact details creates a profile that is not far from a credit score – but with fewer legal protections.
Compared with Big Tech, financial institutions have traditionally enjoyed a reputation for being boring but safe with data. That moat is eroding. If banks start acting like ad‑tech companies, they can expect to be judged – and eventually regulated – like ad‑tech companies.
The European / Regional Angle
European readers might be tempted to dismiss this as a U.S.‑only drama. That would be a mistake.
Under the EU’s GDPR, much of the conduct alleged here would run into immediate trouble:
- Legal basis and transparency. Selling student data – especially involving minors and attributes like race and financial need – would almost certainly require explicit, well‑informed consent. Burying this in a generic privacy policy is unlikely to pass the test.
- Purpose limitation. Collecting data to match scholarships and then repurposing it to fuel an ad network would clash with GDPR’s strict purpose limitation rules unless users clearly agreed to that secondary use.
- Special categories and minors. Data about race and, depending on context, health or socio‑economic hardship touches on special‑category processing. For minors, regulators apply an even higher bar.
European banks, neobanks and scholarship‑matching platforms should read the Scholly case as a warning: regulators will not distinguish much between a “financial institution” and an “education solutions company” when the same corporate group is pulling the strings. The EU’s Digital Services Act and the upcoming AI Act both push in the same direction – more accountability for profiling and targeting, especially of young users.
For European students increasingly turning to private platforms to find scholarships, internships and side hustles, the key question is: who is your app really working for – you, or the advertisers behind the scenes?
Looking Ahead
The legal process in Delaware and any SEC review will take years, not months. But several things are likely to happen much sooner.
1. Regulatory curiosity will increase. Even without formal investigations, U.S. and possibly EU regulators will quietly treat Scholly as a case study in data use around minors and financial products. Expect conference speeches, soft guidance and maybe new rule‑making that closes obvious loopholes around subsidiaries and data sales.
2. M&A terms in edtech/fintech will harden. Founders who have built trust with sensitive populations will start pushing for stronger data‑use covenants in acquisition agreements: restrictions on onward sale, requirements for user re‑consent, maybe even sunset clauses on particularly sensitive data.
3. A branding and UX arms race. Gray’s complaint highlights how similar branding between Sallie Mae and Sallie.com could confuse students. You can expect more regulatory focus on “dark patterns” and brand mimicry, and more companies spinning up “friendly” sub‑brands to distance hard‑sell tactics from regulated core entities.
4. Pressure for a “data fiduciary” model. There is a growing movement, especially in academic and policy circles, to treat certain data holders – like educational and financial actors – as fiduciaries with duties of loyalty and care toward users. Cases like this strengthen the argument that basic contract law and privacy policies are not enough.
The biggest unknown is how much hard evidence Gray can bring to court. Internal emails, product roadmaps, sales decks and data‑sharing contracts will decide whether this becomes a footnote or a landmark in student‑data governance.
The Bottom Line
Scholly vs. Sallie Mae is about more than one founder’s exit gone sour. It exposes how easily a product built to expand access to education can be flipped into a targeting engine for advertisers and lenders.
If you build products for students or other vulnerable groups, assume your company will be sold one day and design your data practices for that reality. If you’re a regulator, ask yourself whether today’s rules really stop regulated players from doing through subsidiaries what they cannot do directly.
The uncomfortable question for all of us: how much of our “helpful” education tech is really just the onboarding layer for the next generation of data‑driven debt?