Impact / Research
What we publish. What we learn.
- Access: open access, first
- Review: peer review, IRB where applicable
- Data: aggregated, anonymized, opt-in only
- First publication: forthcoming
The research commitment.
The Foundation commits to three things about the research we produce, publish, or participate in. First, it is peer-reviewed when the work is suitable for peer review. Second, it is governed by an Institutional Review Board when it involves human subjects. Third, it is open access by default: results are free to read, data is shared with other researchers under appropriate agreements, and methods are documented in full.
This is not how every nonprofit publishes research. It is how we will.
Partnerships we seek.
We are in early conversations with the kinds of institutions that can hold this work to a high standard, and we will name each partnership as it is signed. The shape of the partnership matters more than the name above the door.
Universities with wellbeing, public health, or HCI programs.
For longitudinal studies on loneliness, connection, and community formation. Shared authorship, shared IRB oversight, shared data access.
Public-health institutions and health systems.
For co-designed pilots that test whether community-first software changes measurable health outcomes in defined populations.
Independent research committees.
The Foundation will stand up its own research review body, independent of the leadership team, to vet proposals and flag conflicts of interest.
Funders of public-interest research.
Grants that underwrite the studies, not the conclusions. We will decline funding that requires a predetermined finding.
Research we will not do.
Some studies are common in commercial technology. They will not happen here: the charter does not permit them, the board will not approve them, and the AI service is built so the instrumentation needed to run them does not exist by default.
No behavioral-manipulation studies.
No A/B tests of dark patterns, retention loops, or engagement hooks. If a study would only be useful for extracting more time from people, we do not run it.
No emotion-contagion experiments.
Not on users, not on cohorts, not for any partner. Facebook's 2014 emotion-contagion experiment set a precedent we will not repeat.
No covert instrumentation.
Members are told when they are part of a study. Opt-in is explicit. We will not learn from research that subjects did not know they were in.
No vendor-funded studies that predetermine the finding.
Grants and partnerships fund the question, not the answer. If the answer is required up front, the partnership is not for us.
The data that could support research, and the pipeline we will build.
The Foundation holds data about how people use the product. Most of it never leaves the individual user’s context and will never touch a research pipeline. The small subset that could, legitimately, support population-level research on wellbeing is gated behind explicit opt-in and aggregated before it leaves the database.
Our commitment is plain: an opted-in research contribution is aggregated, anonymized, and reviewed by the research committee before a single query runs against it. Individual conversations with the companion are never research material. Circle data is never research material.
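The invariants above can be sketched in code. This is a minimal illustration of the gating logic, not the Foundation's actual pipeline; every name in it (`Contribution`, `MIN_COHORT`, `research_aggregate`) is hypothetical:

```python
from dataclasses import dataclass

# Hypothetical minimum cohort size before an aggregate may be released,
# so no individual's contribution can be inferred from the result.
MIN_COHORT = 25

@dataclass
class Contribution:
    user_id: str
    opted_in: bool          # explicit, per-user opt-in flag
    wellbeing_score: float  # an illustrative aggregate-able metric

def research_aggregate(rows: list[Contribution], committee_approved: bool) -> float:
    """Run a research query only if every gate passes.

    Gates, in order: (1) the research committee has reviewed and approved
    the query; (2) only explicitly opted-in contributions are considered;
    (3) the cohort is large enough to anonymize; (4) only an aggregate
    value leaves the function -- never row-level data.
    """
    if not committee_approved:
        raise PermissionError("research committee has not approved this query")
    cohort = [r for r in rows if r.opted_in]
    if len(cohort) < MIN_COHORT:
        raise ValueError("cohort too small to anonymize safely")
    return sum(r.wellbeing_score for r in cohort) / len(cohort)
```

The design choice the sketch is meant to show: approval, opt-in filtering, and the cohort floor are enforced inside the query path itself, so an unreviewed or under-sized query fails rather than silently returning data.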
Read the full data commitment at /transparency/data. It is the governing document when these two pages disagree.
How to propose a collaboration.
We welcome proposals from universities, public-health researchers, and independent scholars. A short email is enough to start. Include the question you want to study, the method, the population, and the IRB framework you intend to use. We will reply within two weeks with an initial assessment of fit.
Mutual review is part of how we work. You are allowed to ask us hard questions about our data, our methods, and our assumptions. We will answer them in writing.
Write to hello@elitesgen.org with the subject “Research proposal.”
Include institutional affiliation, IRB status, proposed data scope, and a short description of the question.
The data, first.
The research question is only as good as the data that answers it. Read the Foundation's data commitment before you propose anything.