How Community Technology Centers Iterate on Digital Literacy Programs
The Moving Target Problem
Community technology centers exist to reduce digital divides: the gaps in digital skills, access, and literacy that correlate with age, income, educational attainment, disability, language, and geography. The premise of this work is that digital skills are practically important — that lacking them places people at a disadvantage in accessing services, employment, education, and civic participation.
This premise is correct. It is also constantly shifting in its specific implications. "Digital skills" in 2000 meant email and basic web navigation. In 2010 it meant those things plus social media, basic online security practices, and mobile device use. In 2020 it meant those things plus video conferencing, recognizing misinformation, navigating digital health records, and using a widening array of government services that had moved online. In 2025 it means all of those things plus understanding AI-assisted tools, managing algorithmic curation of information, recognizing AI-generated synthetic media, and using increasingly AI-embedded applications across domains.
Each of these shifts creates new gaps between what community technology centers teach and what community members need to know. Centers that are not continuously revising their programs are perpetually teaching yesterday's digital world to people who must navigate today's.
The problem is compounded by the resource constraints of most community technology centers. These are typically underfunded organizations, operating on thin margins, with staff who are passionate but not always technology specialists. They do not have the capacity for the kind of continuous environmental scanning and curriculum development that technology companies conduct as a core business function. They need iterative approaches that are sustainable at their scale and resource level.
Sources of Evidence for Iteration
Useful iteration requires useful data. The data that matters for community technology center program revision comes from several sources, each capturing a different dimension of the gap between current offerings and current need.
In-program observation. Instructors who pay close attention to where participants get stuck — which exercises produce confusion, which concepts require the most repetition, which real-world tasks participants cannot accomplish despite completing the curriculum — have access to granular data about curriculum gaps. Systematizing this observation, rather than leaving it to individual instructor memory, is a low-cost way to generate consistent evidence. A simple log of "questions I could not answer" or "moments where participants could not apply the skill to a real task" maintained by instructors across cohorts produces patterns over time. A sketch of one such log, and a simple way to tally it, appears after this list of sources.
Post-program follow-up. Satisfaction surveys administered at the end of a course measure participant affect, not skill transfer. To assess whether the program actually produces the intended outcomes, follow-up contact with participants four to eight weeks after completion is necessary. This follow-up can be brief — a structured phone call or short survey — and need not cover every participant in every cohort. A consistent sample over time produces data on skill retention and real-world application that is far more useful for program revision than end-of-course satisfaction scores.
Community referral patterns. When community members who have not attended the center's programs appear at the center seeking help with a specific digital problem, the nature of their problem is data. A pattern of walk-ins struggling with a specific platform, type of scam, or government digital service indicates a gap in community digital literacy that the center's programs may not be addressing. Tracking these walk-in requests and comparing them to the current curriculum reveals whether the curriculum is aligned with community need. The sketch after this list tracks walk-in requests in the same log as instructor observations, for exactly this comparison.
Partnerships with service organizations. Organizations that serve the same population as the technology center — housing assistance agencies, workforce development programs, social services organizations — encounter digital barriers in their clients' lives daily. A workforce development organization that consistently finds that its job seekers cannot navigate online application systems, despite completing digital literacy training, is detecting a specific skills gap. Building structured information-sharing with partner organizations creates an external observation network that amplifies the center's own data collection.
Technology practitioner input. Community members who work professionally in technology — often underutilized as resources by community organizations — can provide current-state knowledge about the technology landscape that center staff may lack. This includes knowledge about which platforms are actually being used by employers, which digital tools are becoming standard in fields that center participants are trying to enter, and which emerging technologies are likely to create new literacy requirements in the near term. Building relationships with local technology practitioners who can serve as informal curriculum advisors is a low-cost way to access this expertise.
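To make the instructor log and the walk-in tracking concrete, here is a minimal sketch of one way to keep both evidence sources in a single spreadsheet-style CSV and tally it for recurring gaps. It is written in Python, and the file name, column names, topic labels, and module labels are all illustrative assumptions rather than a prescribed format.

    import csv
    from collections import Counter

    # Assumed log format, one CSV for both evidence sources:
    #   date,source,topic,note
    #   2025-03-04,observation,phishing,"could not identify a spoofed sender"
    #   2025-03-06,walk-in,benefits-portal,"locked out of state benefits account"
    # "source" records whether the entry came from an instructor observation
    # or a walk-in request; "topic" is a short label agreed on in advance.

    CURRENT_MODULES = {"email-basics", "phishing", "job-search"}  # assumed labels

    def tally_gaps(log_path, min_count=3):
        """Print topics that recur at least min_count times, flagging any
        that no current module covers."""
        counts = Counter()
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):
                counts[row["topic"]] += 1
        for topic, n in counts.most_common():
            if n < min_count:
                break  # most_common() is sorted, so nothing later qualifies
            covered = "covered" if topic in CURRENT_MODULES else "NOT covered"
            print(f"{topic}: {n} entries ({covered} by current curriculum)")

    tally_gaps("evidence_log.csv")

Even this much structure turns scattered instructor memory into a durable pattern, and the same file can feed the scheduled reviews described later in this piece.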
Curriculum Architecture for Iteration
The design of a curriculum significantly affects how easily it can be revised. Curricula designed as integrated, linear sequences — where content builds on prior content throughout the program — are difficult to revise partially. Changing one element requires assessing its downstream effects on all subsequent elements. This creates strong inertia against revision.
Modular curricula, designed as largely independent units that can be combined in different sequences, are much easier to revise. When a specific module becomes outdated — when the platform it covers has been replaced, or when a new digital challenge has emerged that the module does not address — it can be updated or replaced without requiring revision of the entire curriculum. This makes iteration tractable even for organizations with limited curriculum development capacity.
Beyond modularity, certain curriculum design choices specifically support iterative revision:
Explicit statement of the skills each module develops. When a module's purpose is stated in terms of transferable skills rather than specific platform competencies, its relevance is easier to assess as the technology landscape changes. A module described as "using platform X to accomplish task Y" becomes obsolete when platform X changes significantly. A module described as "evaluating the credibility of online information" remains relevant even as the specific platforms and tools for doing so evolve.
Separation of concept from interface. Digital literacy includes both conceptual understanding (what is encryption, how does a phishing attack work, what does an algorithm optimize for) and procedural fluency (how to accomplish specific tasks in specific applications). Conceptual content tends to remain relevant longer than procedural content. Organizing curriculum to distinguish these two layers makes it easier to update procedural content as applications change while preserving the more durable conceptual foundation.
Documentation of revision history. Maintaining a record of when specific modules were revised and why — what evidence prompted the revision and what changes were made — creates institutional memory that survives staff turnover. Staff who join the center after a revision can understand why the current curriculum exists in its current form, and can assess future revision needs against a baseline of prior decisions.
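One way to make these three design choices operational is to attach a small metadata record to each module. The sketch below, again in Python, is one possible shape for such a record; the field names and example values are assumptions about how a center might label things, not a standard.

    from dataclasses import dataclass, field

    @dataclass
    class Revision:
        date: str      # when the module was revised
        evidence: str  # what evidence prompted the revision
        change: str    # what was changed

    @dataclass
    class Module:
        name: str
        transferable_skill: str  # stated in platform-independent terms
        conceptual_content: list  # durable layer: concepts
        procedural_content: list  # volatile layer: specific interfaces
        revisions: list = field(default_factory=list)

    # Example: the skill statement outlives any one platform, and the
    # procedural layer is the only part expected to churn.
    info_eval = Module(
        name="evaluating-online-information",
        transferable_skill="evaluate the credibility of online information",
        conceptual_content=["lateral reading", "source incentives"],
        procedural_content=["reverse image search in current browsers"],
        revisions=[Revision("2024-09",
                            "walk-in pattern: AI-generated images",
                            "added synthetic-media checks to procedural layer")],
    )

When a platform changes, only the procedural layer and the revision log need to move; the skill statement and conceptual layer stay put, which is exactly the separation the paragraphs above argue for.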
Iteration Processes at Sustainable Scale
The challenge for most community technology centers is building iteration processes that are genuinely sustainable given their resource constraints. A quarterly curriculum review conducted by a dedicated curriculum development team may be the ideal; it is not achievable for most centers operating with one or two program staff.
Sustainable iteration at small scale requires several design choices:
Staged revision cycles. Rather than attempting to revise all program content simultaneously, cycling through curriculum elements on a schedule — revising the cybersecurity module this year, the job search module next year, the benefits navigation module the year after — makes the work tractable. The cycle length depends on how quickly specific content becomes obsolete; some elements may need annual review while others can go longer.
Revision triggers as distinct from scheduled cycles. Some revisions should happen on a schedule regardless of whether evidence of obsolescence has accumulated. Others should happen when specific triggers occur: a major platform change, a new government service moving online, an identified pattern of community members being harmed by a scam the curriculum does not address. Building explicit revision triggers into the program management process ensures that urgent needs are addressed without waiting for the next scheduled review. A sketch that combines scheduled cycles and triggers in a single review queue appears after these design choices.
Participant involvement in curriculum review. Former participants who have been active in the community since completing the program have direct knowledge of how the skills they learned have held up against the digital challenges they have encountered. Including former participants in periodic curriculum review sessions — even informally — provides perspective that staff observations cannot capture.
Peer exchange with other centers. Community technology centers face similar challenges and have limited capacity for individual curriculum development. Regional and national networks — NTEN, the Digital Equity Initiative, local library consortia — provide opportunities for centers to share curriculum elements, revision strategies, and evidence about what is working. A center that can contribute its job search module to a shared pool and receive updated cybersecurity content in return is doing less total work than a center that must develop all content independently.
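Here is a minimal sketch of the queue that staged cycles and revision triggers can share, assuming each module records its last review date and a per-module cycle length. The module names, cycle lengths, and trigger events are invented for illustration.

    from datetime import date, timedelta

    # Assumed per-module review cycles, in days; content that ages quickly
    # (scams, platform-specific material) gets a shorter cycle.
    REVIEW_CYCLES = {
        "cybersecurity": 365,
        "job-search": 730,
        "benefits-navigation": 730,
    }
    LAST_REVIEWED = {
        "cybersecurity": date(2024, 6, 1),
        "job-search": date(2023, 9, 1),
        "benefits-navigation": date(2024, 1, 15),
    }
    # Trigger events logged as they occur: (module, reason).
    TRIGGERS = [
        ("benefits-navigation", "state benefits portal replaced in March"),
    ]

    def review_queue(today):
        """Return modules due for review, by schedule or by trigger."""
        due = []
        for module, cycle_days in REVIEW_CYCLES.items():
            if today - LAST_REVIEWED[module] >= timedelta(days=cycle_days):
                due.append(f"{module}: scheduled cycle elapsed")
        for module, reason in TRIGGERS:
            due.append(f"{module}: trigger, {reason}")
        return due

    for item in review_queue(date.today()):
        print(item)

The point of the sketch is the shape of the process, not the tooling: a whiteboard with review dates and a standing agenda item for triggers implements the same logic.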
Measuring Whether Iteration Is Working
The ultimate test of an iterative program is whether the revisions actually improve outcomes. This requires clarity about what outcomes matter and how they are measured.
The most relevant outcomes for community technology center programs are not participation rates or course completion rates, though these are commonly reported because they are easy to measure. They are:
Skill application. Do participants use the skills they learned in their actual digital lives? Follow-up data that assesses this — even imperfectly, through self-report or through proxy measures like successful completion of the specific digital task that motivated participation — is far more useful for program revision than completion counts.
Harm reduction. Are participants who complete digital literacy programs less likely to fall victim to online scams, identity theft, misinformation, and other digital harms? This is harder to measure but worth attempting, because a program that fails to reduce digital harms — even if participants rate it highly — is not producing its intended effect.
Goal achievement. Did participants accomplish the specific digital goal that brought them to the program — applying for a job online, navigating a government benefits portal, video calling family members, participating in a telehealth appointment? Tracking goal achievement rates at follow-up provides a direct measure of program effectiveness against the actual needs that motivated participation.
When iteration is working, these outcomes should improve over time. When they are not improving despite ongoing program revision, the data signals that either the revision process is not identifying the right problems or the changes being made are not addressing the root causes of program failure. Both diagnoses are useful, and both are only available to organizations that are measuring outcomes rather than activities.
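As one concrete illustration of measuring outcomes rather than activities, the sketch below compares goal-achievement rates at follow-up across successive cohorts. The cohort labels and numbers are invented; the only claim is that a working revision process should push the rate upward over time.

    # Follow-up results per cohort: (label, participants reached at
    # follow-up, number who achieved the goal that brought them in).
    FOLLOW_UPS = [
        ("2024-spring", 18, 9),
        ("2024-fall", 21, 12),
        ("2025-spring", 17, 12),
    ]

    for label, reached, achieved in FOLLOW_UPS:
        rate = achieved / reached
        print(f"{label}: {rate:.0%} achieved their motivating goal at follow-up")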
Community technology centers that build genuine iterative practice into their operations are doing something difficult and important. They are attempting to close the gap between what the digital world requires of people who are historically excluded from its benefits and what these centers are actually able to teach. Iteration does not guarantee success in that effort. But the absence of iteration guarantees that the gap grows wider every year.