Building Smol Gardens for Accountable Tech

Smol Gardens is a design proposal by the Femmecubator team, introduced at BetaNYC's UnSchool of Data 2026 conference in March, as part of the Open Civic Tech initiative.

Inspired by Anil Dash's 2012 essay "The Web We Lost," which critiqued the internet's drift into isolated "walled gardens," Smol Gardens challenges civic technologists to build responsibly with AI and to consider alternatives such as small language models (SLMs) or reusable, evergreen solutions. Is AI needed at all?


Our North Star

“Can we build responsibly in the age of AI?

What alternatives exist that don't create redundant, unmaintainable, or harmful solutions?”

Civic tech builders face an urgent crisis: rapid AI-driven development, exemplified by vibecoding culture, promises speed without weighing tradeoffs and consequences. It creates redundant systems, technical debt, and unmaintainable tools, while policy lags far behind deployment and civic technologists face pressure to sacrifice responsibility.

Historically, designers and community members have been locked out of tool-building without technical collaborators. Vibecoding democratizes development, but we must ask at what cost. We acknowledge real value in embracing new technology, and we want to leverage it so civic tech work doesn't fall behind.

THE ACCOUNTABILITY GAP

Who owns an AI-built tool when it fails? Whose data is protected? What happens to communities served by hastily-built systems? These questions go unasked. Large organizations justify AI as cost-cutting, ignoring real harms: data center expansion, environmental costs, unauthorized creative labor use, and loss of community accountability.

CURRENT LANDSCAPE

Vibecoding, for example, creates a new workflow culture that surfaces many issues: it strains the sustainability of processes, data management, and cybersecurity.

Large organizations view AI as a cost-cutting tool, capable of replacing thousands of workers. Yet this narrative ignores deeper harms: massive data center expansion, environmental degradation, redundant code, technical debt, data sovereignty risks, and malware vectors.

Policy lags behind deployment. Ethical AI frameworks are trailing far behind the speed of innovation. AI is already being misused—through malvertising via vibecoded sites, unauthorized use of writers' and designers' work, and accelerating narratives about the diminishing value of human creative labor.

SMALL LANGUAGE MODELS (SLMs)

SLMs are efficient, compact AI models with millions to a few billion parameters—far fewer than the hundreds of billions in LLMs. They run locally on edge devices, work offline, and require minimal resources. This makes them ideal for specialized, mission-critical tasks while reducing dependence on Big Tech infrastructure. By choosing SLMs, we align our toolkit with our values: work small and local, create accessible and community-driven tools, and resist corporate dependence.
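The resource gap is easy to see in rough numbers: a model's weight memory is approximately its parameter count times bytes per parameter. A minimal back-of-the-envelope sketch, assuming fp16 weights; the parameter counts below are illustrative examples, not benchmarks of any specific model:

```python
def approx_memory_gb(params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed to hold model weights.

    fp16 weights use 2 bytes per parameter; this ignores activation
    and runtime overhead, so real usage is somewhat higher.
    """
    return params * bytes_per_param / 1e9

# An SLM in the low-billions range fits comfortably on a laptop
# or edge device; a hypothetical 175B-parameter LLM does not.
slm_gb = approx_memory_gb(1.7e9)   # ~3.4 GB of weights
llm_gb = approx_memory_gb(175e9)   # ~350 GB of weights

print(f"SLM (~1.7B params): {slm_gb:.1f} GB")
print(f"LLM (~175B params): {llm_gb:.1f} GB")
```

This is why an SLM can run offline on commodity hardware while a frontier-scale LLM effectively requires hosted, multi-GPU infrastructure.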

OUR PROPOSAL
Smol Gardens lays the groundwork for iterative study through three phases:

  1. Impact Framework: Create a data-tracking system that measures human, systems, and environmental consequences. Embed accountability into every decision, not as an afterthought but as a foundation.

  2. Design Workshops: Invite civic technologists to document actual AI usage through a "diary mission," revealing hidden costs and building collective responsibility.

  3. Accountable Tech Platform: Use small language models (SLMs) to build a curated repository of civic tech tools reviewed for ethical impact and community benefit. Every tool documents its tradeoffs.

Guiding Principles

PRIORITIZE SMALL, INDEPENDENT TOOLS

We choose small language models like Hugging Face's SmolLM that operate independently of Big Tech platforms. This reinforces "work small and local," creating accessible, community-driven tools rather than corporate-dependent systems.

PRACTICE SECOND-ORDER THINKING

We ask difficult questions: Who benefits? What's the real cost of inaction? What tradeoffs emerge for humans, systems, and the environment? This discipline ensures intentional building, not reactive momentum.

BUILD EVERGREEN SYSTEMS

We design reusable, openly available tools so others don't repeat the effort. This multiplies impact and builds collective knowledge for the civic tech community.

EMBRACE OPEN COLLABORATION

We invite builders to challenge our work, push it further, and improve it together through collaborative #goodvibing that makes our solutions stronger.

Forming the #accountabletech working group

Join the working group through the BetaNYC #accountable-tech-group channel. Once you’re in, please complete this survey to let us know how you’d want to be part of the conversation.

Peer-led working sessions

As part of our action plan, we will establish smaller working groups to test this idea and advance the work.

Each group will have designated focus areas, report on progress using a shared backlog, and convene bi-weekly to discuss updates.


Interest Groups:

Data Management Group
Focus: Developing a tracking system and data index based on the initial checklist we have proposed.

Required expertise: Data analysis, data science, domain knowledge

Prototype Development Group
Focus: Design and implementation of a prototype system; collaboration with external partners to host design workshops to collect evidence, design iteratively and validate the data index tracker.

Required expertise: Design facilitation, design evaluation, development

Research and Policy Framework Group
Focus: Creating proposals for institutional frameworks and standards adoption

Required expertise: Policy creation and analysis, strategic planning, stakeholder engagement
