Comprehensive methodological framework for Critical Code Studies (CCS)
name: critical-code-studies
description: Comprehensive methodological framework for Critical Code Studies (CCS), integrating materialist-phenomenological method, hermeneutic-rhetorical approach, and centrifugal close reading from 10 PRINT. Use when analysing code as cultural text, conducting close readings of software, or applying critical theory to computational systems.
version: 1.0.0
date: 2026-01-20
author: David M. Berry

Critical Code Studies Methodology

Overview

This skill provides a comprehensive methodological framework for Critical Code Studies (CCS), integrating four complementary approaches:

  1. Materialist-phenomenological method: Code as crystallisation of social formations, examined through multi-dimensional analysis (literature, mechanism, spatial form, repository), tests of strength, political economy, and phenomenology of computation.

  2. Hermeneutic-rhetorical approach: Code as social text with extrafunctional significance, analysed through critical hermeneutics, close reading, rhetorical analysis, and interpretation of meaning beyond functionality.

  3. Centrifugal close reading (10 PRINT): Spiral outward from single line or minimal program to explore seemingly disparate aspects of culture, treating code through variorum approach across different printed variants and platforms.

  4. Critical lenses (race, gender, postcolonial): Code examined through feminist theory, critical race theory, and postcolonial criticism, attending to how computational systems encode and reproduce social hierarchies, construct gendered and racialised subject positions, and extend or resist colonial logics.

These approaches share a commitment to understanding code as embedded in power relations, requiring critical engagement with technical operations and social contexts whilst recognising code's extrafunctional significance and cultural circulation.

When to Activate

Use this skill when:

  • Conducting close reading of source code as cultural text
  • Analysing software's social, political, or ideological dimensions
  • Examining algorithms for embedded assumptions and biases
  • Investigating historical software through media archaeology
  • Applying critical theory frameworks to computational systems
  • Writing academic analysis of code or software
  • Teaching or explaining CCS methodology
  • Building tools that support critical code analysis
  • Examining AI/ML systems critically

Core Methodological Principles

Foundational Definitions

Marino (2006, 2020): "Critical Code Studies applies critical hermeneutics to the interpretation of computer source code, program architecture, and documentation within a sociohistorical context."

Core premises:

  • Lines of code are not value-neutral
  • Code can be analysed using theoretical approaches applied to other semiotic systems
  • Meaning grows out of functioning but is not limited to literal processes enacted
  • Code is doubly hidden: by illiteracy and by screens on which output delights and distracts

Berry: Code must be approached in its multiplicity as simultaneously:

  • Literature: Readable text with hermeneutic dimensions
  • Mechanism: Operative machinery with material effects
  • Spatial form: Organised structure with architectural properties
  • Repository: Crystallisation of social norms, values, processes

Key Concepts

  • Extrafunctional significance (Marino): Meaning grows out of functionality whilst exceeding it
  • Double mediation (Berry): Software mediates relationship with code in both writing and execution
  • Code as social text (Marino): Meaning develops as readers encounter code over time in shifting contexts
  • Code as cultural text (Marino): Not fine art but perhaps artisanal craft, everyday object of inquiry with meaningful relationship to culture

Code's Dual Character

Unambiguous dimension (technical):

  • Produces specific computational effects
  • Must compile/execute correctly
  • Operations are precise and deterministic
  • Validated through tests of strength
  • Only language that is executable (Galloway)

Ambiguous dimension (social):

  • Meaning proliferates through human interpretation
  • Circulates within multiple discourse communities
  • Subject to rhetorical triad of speaker, audience, message
  • Develops connotations through reception and recirculation
  • Bears significance in excess of functional utility

Methodological implication: Cannot read code solely for functionality without considering what it means. Both dimensions require simultaneous attention.


Constellational Analysis: Berry's Three-Level Framework

Berry's most recent methodological statement proposes constellational analysis, a systematic three-level approach grounded in Habermas's theory of cognitive interests.

Level 1: Technical-Instrumental

Computer science offers tools for examining how code implements forms of technical control through computational mechanisms and structures.

Focus: Formal logic, programming techniques, algorithmic processes, architectural decisions

Questions:

  • How does code materially structure social relations through implementation choices?
  • What computational mechanisms enable particular operations?
  • What technical constraints shape possibilities?

Example analysis:

import tensorflow as tf

def attention(query, key, value):
    # dot-product attention: similarity of each query to each key
    score = tf.matmul(query, key, transpose_b=True)
    # softmax turns the scores into weights over positions
    weights = tf.nn.softmax(score, axis=-1)
    return tf.matmul(weights, value)

Technical choices embed assumptions about language and meaning. Attention mechanism privileges certain patterns whilst excluding others, shaping how system processes and generates text.

Level 2: Practical-Communicative

Hermeneutics investigates how code operates as discourse, shaping social understanding through languages, documentation, and practices.

Focus: Code as discourse, social meaning, communicative practices, historical development

Questions:

  • How does code function as discourse shaping social interactions?
  • What assumptions about human communication does code encode?
  • How do technical choices reflect and construct social understanding?

Example analysis (Facebook News Feed evolution):

# 2009 Implementation (chronological ordering)
def rank_stories(stories):
    return sorted(stories, key=lambda x: x.time)

# 2018 Implementation (engagement-weighted ranking)
def rank_stories(stories, user):
    for story in stories:
        story.score = ((story.likes * 0.5 +
                        story.comments * 2.0 +
                        story.shares * 1.5) *
                       time_decay(story.age) *
                       user.affinity(story.author))
    return sorted(stories, key=lambda x: x.score, reverse=True)

The shift from chronological ordering to engagement metrics documents changing corporate priorities, from user growth to income generation.

Level 3: Emancipatory

Critical theory reveals how code embeds and reproduces power relations whilst containing possibilities for resistance and transformation.

Focus: Ideology critique, power relations, political economy, emancipatory possibilities

Questions:

  • What power relations does code encode and reproduce?
  • How does code participate in regimes of extraction, surveillance, control?
  • What alternative implementations support more democratic relations?
  • Where are possibilities for resistance immanent within the computational?

Example (Mastodon's alternative model):

# Ruby (simplified sketch of a Mastodon status model)
def visibility_policy
  return :public if public?
  return :unlisted if unlisted?
  return :private if private?
  return :direct if direct?
end

Code prioritises user control and federation over centralised algorithmic manipulation, demonstrating how alternative technical implementations can support more emancipatory social norms.

Dialectical Relationships

Not hierarchical but mutually determining:

  • Technical constraints shape but don't overdetermine meaning-making possibilities
  • Social practices influence technical implementations
  • Critical analysis reveals how technical choices relate to broader power structures

Analytical Frameworks

Tests of Strength (Berry)

Code validated through dual requirements:

Legitimate Tests: Syntax conformity, programming conventions, compilation compatibility, specification fulfilment

Material Tests: Compilation without errors, execution of specified functions, correct hardware interfacing, expected outputs

Critical insight: Tests reveal code's limits—both technical constraints and ideological boundaries.
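
A minimal, hypothetical illustration of the distinction: the function below passes legitimate tests (valid syntax, conventional naming) yet fails a material test the moment it is executed against an empty input:

def mean_response_time(samples):
    # legitimate tests pass: valid syntax, conventional naming, type-checks cleanly
    return sum(samples) / len(samples)

mean_response_time([])   # material test fails: ZeroDivisionError at execution

The failure marks a technical limit, but deciding which inputs count as legitimate is itself a normative choice.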

Reading Strategies

Close Reading (Marino's emphasis):

  • Examine naming conventions, variable choices, comments
  • Analyse assumptions embedded in function names
  • Trace gender, cultural, political biases
  • Identify subject positions constructed in code
  • Apply hermeneutics of suspicion—read between lines
  • Interpret specific coding elements, symbols, structures

Distant Reading (Berry's emphasis):

  • Visual imagery and spatial organisation
  • Architectural patterns and modular structure
  • Code libraries and external linkages
  • Dependencies and infrastructural relations

Reading Levels (integrated):

  • Surface: What code appears to do
  • Functional: What code actually executes
  • Social: What assumptions code embeds
  • Political: What power relations code encodes
  • Phenomenological: How code structures experience

Three-fold Analysis (Berry)

Ontology: What code IS

  • Material composition, virtual existence
  • Infrastructural embedding, mediated character

Genealogy: Where code comes FROM

  • Historical development (media archaeology)
  • Evolution of practices, programming paradigms
  • Social formations of programming communities

Mechanology: What code DOES

  • Dynamic operations, effects on social practices
  • Structuring of everyday life
  • Political and economic functions

Key Methodological Concepts

Extrafunctional Significance (Marino)

Meaning that grows out of code's functioning but is not limited to literal processes it enacts.

Not: Outside function, in addition to function, separate from function.
Instead: Growing out of function whilst exceeding it.

Examples:

  • Climategate: Temporary "fudge factor" becomes "smoking gun" through public reception
  • Women on GitHub: User ID number gains rhetorical significance through prominent display
  • Variable naming: Conceptual metaphors add layer of meaning (e.g., "witchingEvent" vs "locationFound")
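
A hypothetical sketch of the last point: the two assignments below are functionally identical, but the second name layers a conceptual metaphor (water witching, as in the Transborder Immigrant Tool) onto the same operation:

class GPS:
    def has_fix(self):
        return True   # stub standing in for real positioning hardware

gps = GPS()
locationFound = gps.has_fix()   # neutral, technical register
witchingEvent = gps.has_fix()   # dowsing metaphor carrying extrafunctional meaning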

Code as Semiotic System (Marino)

Code is social, semiotic system employing grammar and rhetoric.

As Rita Raley argues: "Code may in a general sense be opaque and legible only to specialists, much like cave painting's sign system, but it has been inscribed, programmed, written. It is conditioned and concretely historical".

Implications:

  • Sign system with own rhetoric
  • Verbal communication possessing significance in excess of functional utility
  • Form of symbolic expression and interaction
  • Not mere cipher whose meaning need merely be revealed

Hermeneutics of Suspicion (Marino)

CCS operates under Ricoeur's hermeneutics of suspicion:

  • Read between lines
  • Explore ambiguity and social contest
  • Seek gaps and remainders
  • Understand arbitrary nature of language opens communication to ideological influences

Applied to code: "The walls of the computer do not remove code from the world but encode the world and human biases"

Means analysing: Race, ethnicity, gender, sexuality, socioeconomic status, political representation, ideology, power relations


Centrifugal Close Reading (10 PRINT Methodology)

Against Distant Reading

The 10 PRINT methodology stands opposed to computational distant reading:

Centrifugal force principle: Spiral outward from single line of text to explore seemingly disparate aspects of culture

Not breadth but depth: Extremely intense consideration of minimal code reveals as much as close readings of complex cultural artefacts (Barthes's S/Z, Foucault's Ceci n'est pas une pipe)

Grain of sand approach: Single line as Rosetta Stone yielding access to creative computing and how programs exist in culture

Variorum Approach

Focus on specific program existing in different printed variants executing on particular platform.

Foregrounds overlooked aspects:

  • Computer programs typically exist in different versions
  • Serve as seeds for learning, modification, extension
  • Circulate through multiple communities with different readings

Avoids fetishising code (Chun's warning): Deeply considers context and larger systems at play.

Programming as Scholarship

Core methodological principle: To understand code critically and humanistically, scholarship should include programming.

Four types of programmatic intervention:

  1. Modifications: Slight variations in original parameters

    • Explore space of possibility within platform
    • Illuminate consequences of particular design choices
    • Make arbitrary conventions visible through contrast
  2. Variations: Systematic exploration of alternatives

    • Sketch range of possibilities for algorithm/system
    • Test limits and affordances
    • Reveal what is essential versus contingent
  3. Elaborations: Extensions and enhancements

    • Build upon minimal program to understand scaling
    • Explore what simple code enables when developed
    • Test boundaries between simple and complex
  4. Ports: Translations across platforms/languages

    • Porting as bearing program from one system to another
    • Programmer as translator facing questions of fidelity and improvement
    • Reveals nuances of original through adaptation challenges
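
For instance, a port of 10 PRINT itself from Commodore 64 BASIC to Python (a sketch; the Unicode box-drawing characters stand in for PETSCII codes 205 and 206):

import random

# 10 PRINT CHR$(205.5+RND(1)); : GOTO 10  -- ported from Commodore 64 BASIC
while True:
    print(random.choice("╱╲"), end="")

The port immediately raises questions of fidelity: Python has no PETSCII and no forty-column screen, so the translator must decide what counts as the "same" program on a different platform.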

Token-by-Token Analysis

Foundation for understanding: Look at each token, each character of program.

Reveals multiple dimensions:

  • Technical: Why particular program functions the way it does
  • Material: How hardware and system design enable operations
  • Historical: Remnants and residues from earlier technologies
  • Cultural: Ghostly associations with distant forms of activity

Example (PRINT keyword):

  • Technical: Invokes CHROUT routine in KERNAL, provides append and scroll
  • Material: Contributes to visual effect through automatic scrolling
  • Historical: Reminder of Teletypes that literally printed, not displayed
  • Cultural: Residue from before change in standard output technology

Practical Application Framework

Stage 1: Identify Object of Analysis

  • Select specific code, system, or software practice
  • Define boundaries whilst remaining attentive to connections
  • Situate technically, socially, historically

Stage 2: Multi-level Reading

  • Technical reading: What does code do computationally?
  • Social reading: What practices does code embed/enable?
  • Political reading: What power relations does code encode?
  • Phenomenological reading: How does code structure experience?

Stage 3: Genealogical Tracing

  • Historical: How did this code/practice develop?
  • Social: What communities produced it?
  • Economic: What political economy supports it?

Stage 4: Critical Analysis

  • Expose: Hidden assumptions, embedded values
  • Critique: Political and ethical implications
  • Evaluate: Alternatives and resistances

Stage 5: Synthesis

  • Integrate: Technical, social, political dimensions
  • Theorise: Broader implications
  • Propose: Critical interventions or alternatives

Running Code as Part of Analysis

Key advantage of CCS: Ability to actually execute code to demonstrate functioning and supplement hermeneutic reading.

Process

  1. Extract code implementation from archive/repository/documentation
  2. Create representative sample data reflecting realistic usage
  3. Execute code with sample data
  4. Generate comparison tables/visualisations showing results
  5. Analyse outputs against social/political context
  6. Use empirical results to test interpretative claims

When to use: Historical algorithm comparisons, alternative implementation testing, exploring parameter variations, demonstrating emergent behaviours, validating interpretative claims about code effects.

Methodological advantage: Empirical demonstration of how code changes alter information flows, supporting hermeneutic claims about embedded priorities whilst revealing precise technical mechanisms.
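
A minimal sketch of this process, reusing the hypothetical feed-ranking logic from the constellational analysis above with invented sample data (authors, weights, and decay function are all illustrative):

from dataclasses import dataclass

@dataclass
class Story:
    author: str
    age: float     # hours since posting
    likes: int
    comments: int
    shares: int

def time_decay(age_hours):
    return 1.0 / (1.0 + age_hours / 24.0)

def rank_chronological(stories):
    # 2009-style ordering: newest first
    return sorted(stories, key=lambda s: s.age)

def rank_engagement(stories, affinity):
    # 2018-style ordering: weighted engagement, decayed by age, scaled by affinity
    def score(s):
        return ((s.likes * 0.5 + s.comments * 2.0 + s.shares * 1.5)
                * time_decay(s.age) * affinity.get(s.author, 1.0))
    return sorted(stories, key=score, reverse=True)

# Step 2: representative sample data reflecting realistic usage
stories = [
    Story("local_group", age=1.0, likes=2, comments=1, shares=0),
    Story("close_friend", age=20.0, likes=15, comments=6, shares=1),
    Story("brand_page", age=4.0, likes=300, comments=90, shares=40),
]
affinity = {"close_friend": 2.0, "brand_page": 0.7, "local_group": 1.0}

# Steps 3-4: execute both implementations and compare the resulting orderings
print("chronological:", [s.author for s in rank_chronological(stories)])
print("engagement:   ", [s.author for s in rank_engagement(stories, affinity)])

Steps 5 and 6 then read these orderings against their social context: the brand page displaces the recent local post, an empirical trace of the priorities the 2018-style weights embed.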


Collaborative and Iterative Reading Process

Reading Code as Collective Practice

Team composition requirements:

  • Sufficient technical and critical skills for interpreting complex code
  • Diversity of expertise: programming fluency, critical theory, philosophy, domain knowledge
  • Combination of those who build and those who interpret

Process characteristics:

  • Not laboratory but seminar room, library, archive
  • Weekly conversations about nature of code, descendants, cultural circulation
  • Hermeneutic spiral of learning—code inspires conversation, entry point for analysis of culture
  • Iteracy: Engagement with symbols as touchstones for exploration

Temporal demands:

  • Few hundred lines may take months of collective reading
  • Sometimes only five lines in an hour, owing to the questions the fragments throw up
  • Iterative process—code reading inspires conversation, conversation informs code reading

Paratexts and Archives

Extensive paratextual materials crucial:

  • Programming manuals and documentation
  • Transcripts of program execution
  • Interviews and correspondence
  • Related materials (proposals, reports, funding documents)
  • More peripheral materials (institutional context, broader debates)

Sometimes understanding code requires:

  • Writing code (testing interpretations through implementation)
  • Bringing in seemingly unrelated study (literature, theatre, poetry analysis)
  • Hunting through patents and scholarship
  • Interviewing those who knew creators or worked on systems

Case Study Methods

Two-Part Structure (Marino)

Part 1: Technical Explanation

  • Present code excerpt
  • Explain context and functioning in detail
  • Define terms, explain programming structures
  • Render code legible for uninitiated
  • Annotate operations

Part 2: Interpretive Analysis

  • Explore meaning beyond what code does
  • Analyse extrafunctional significance
  • Connect to social/political/cultural contexts
  • Consider rhetoric, ideology, power relations

Why Include Large Portions of Code

  • Provide context for interpretation
  • Enable alternative readings by others
  • Show interrelationships within code
  • Make transparent what is being analysed

What Can Be Interpreted (Marino)

Everything: Code, documentation, comments, structures—all open to interpretation.

Paratextual features crucial:

  • History of program
  • Author(s)
  • Programming language
  • Genre
  • Funding source (military, industrial, entertainment)
  • Circulation and reception

Even arbitrary aspects are meaningful: variable names, function names, comments. The most arbitrary aspects often bear the most cultural significance.


Applying Critical Lenses

Race and Algorithmic Discrimination

Critical code studies must attend to how computational systems encode, reproduce, and amplify racial hierarchies. As Noble (2018) demonstrates, search algorithms are not neutral but reflect and reinforce racist epistemologies, systematically devaluing Black women and other marginalised groups through the very structure of information retrieval.

Key analytical questions:

  • How do training datasets encode historical patterns of racial discrimination?
  • What proxy variables allow racial discrimination to operate through ostensibly race-neutral code?
  • How do classification systems impose racial categories that may not align with lived experience?
  • Where does the boundary between technical decision and discriminatory outcome become obscured?

Example (predictive policing algorithm):

def calculate_risk_score(individual):
    score = 0
    score += individual.prior_arrests * 2.5
    score += individual.neighborhood_crime_rate * 1.8
    score += individual.employment_stability * -1.2
    score += individual.age_at_first_arrest * -0.5
    return normalize(score)

Critical analysis: Each variable encodes historical discrimination. Prior arrests reflect racially biased policing practices. Neighbourhood crime rates correlate with redlining and segregation. Employment stability reflects labour market discrimination. The code transforms historical injustice into predictive "risk," creating feedback loops that intensify existing disparities.

Methodological approaches:

  • Audit studies: Testing systems with matched inputs differing only by racial markers (Eubanks 2018)
  • Training data archaeology: Examining datasets for historical bias embedded in labels and categories
  • Proxy analysis: Identifying variables that correlate with race even when race is not explicit
  • Disparity measurement: Quantifying differential impacts across racial groups
  • Counterfactual analysis: Asking how outcomes would differ under alternative implementations

Chun's contribution: Discriminating Data (2021) shows how correlation in data science creates "neighborhoods" that function as proxies for race, enabling discrimination through mathematical abstraction whilst maintaining plausible deniability. The neighbourhood is both technical construct and site of racial formation.
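
A minimal audit-style sketch drawing these approaches together, reusing the illustrative calculate_risk_score above: two matched profiles are scored, the second differing only through variables that proxy for race (arrest history shaped by policing intensity, neighbourhood crime rate, employment stability). All values are invented:

from dataclasses import dataclass

@dataclass
class Individual:
    prior_arrests: int
    neighborhood_crime_rate: float
    employment_stability: float
    age_at_first_arrest: int

def normalize(score):
    return round(score, 2)   # placeholder: the real scaling is unspecified in this sketch

def calculate_risk_score(individual):
    score = 0.0
    score += individual.prior_arrests * 2.5
    score += individual.neighborhood_crime_rate * 1.8
    score += individual.employment_stability * -1.2
    score += individual.age_at_first_arrest * -0.5
    return normalize(score)

# Matched pair: similar conduct, different exposure to policing, segregation,
# and labour-market discrimination (values are illustrative)
profile_a = Individual(prior_arrests=1, neighborhood_crime_rate=0.5,
                       employment_stability=5.0, age_at_first_arrest=25)
profile_b = Individual(prior_arrests=2, neighborhood_crime_rate=4.0,
                       employment_stability=1.5, age_at_first_arrest=19)

print(calculate_risk_score(profile_a), calculate_risk_score(profile_b))

The gap between the two scores is produced entirely by proxy variables; no racial category appears anywhere in the code, which is how proxy discrimination maintains deniability.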

Gender and Power Analysis

Gender analysis in CCS examines how code constructs, reinforces, and sometimes disrupts gender categories and power relations. Following Scott (1986), gender operates as a category of historical analysis applicable to computational systems.

Key analytical questions:

  • How does code construct gendered subject positions for users, developers, and imagined interlocutors?
  • What assumptions about gender are embedded in variable names, function structures, and interaction designs?
  • How do classification systems impose binary gender categories or enable more fluid possibilities?
  • What is the gendered history of computing labour, and how does this shape contemporary code cultures?

Example (DOCTOR/ELIZA analysis):

(MEMR RULES (
    (MY (= YOUR))
    ((* I FEEL @) (TELL ME MORE ABOUT SUCH FEELINGS))
    ((* I * MY MOTHER @) (TELL ME MORE ABOUT YOUR FAMILY))
))

Critical analysis: ELIZA positions the user as feminised patient requiring therapeutic intervention (Dillon 2020). The script's pattern-matching assumes a confessional mode historically associated with women's speech. The computer occupies the authoritative position of (implicitly male) analyst. Even the name "ELIZA" references Shaw's Pygmalion, where a man educates a woman into proper speech.

Historical dimension (Hicks 2018): Britain's early computing workforce was substantially female, but women were systematically excluded as computing gained prestige and pay. Understanding this history illuminates how contemporary code cultures reproduce gender hierarchies that were actively constructed through policy and practice rather than reflecting natural aptitudes.

Haraway's cyborg: The cyborg figure (Haraway 1991) offers alternative analytical possibilities, breaking down boundaries between human and machine, male and female, nature and culture. CCS can examine code that disrupts rather than reinforces gender binaries.

Methodological approaches:

  • Pronoun and naming analysis: Examining how code defaults construct gendered users
  • Labour history: Tracing gendered divisions in who writes, maintains, and uses code
  • Interaction design critique: Analysing how interfaces assume or construct gendered behaviours
  • Classification system analysis: Examining how databases encode gender categories (see the sketch after this list)
  • Feminist STS methods: Applying standpoint theory and situated knowledge to code analysis
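
A hypothetical sketch of classification system analysis: the first schema fragment hard-codes a binary, the second treats gender as optional self-description, deferring the classification decision to the user:

from enum import Enum
from typing import Optional

class GenderBinary(Enum):
    # binary classification imposed at the schema level
    MALE = "M"
    FEMALE = "F"

class UserProfile:
    def __init__(self, gender: Optional[str] = None):
        # optional free text: self-described, may be absent, not validated against a fixed list
        self.gender = gender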

Intersectional Analysis

Race and gender do not operate independently. Following Crenshaw, intersectional analysis examines how multiple axes of power converge in computational systems.

Example (hiring algorithm):

def score_candidate(resume):
    base_score = nlp_similarity(resume, successful_employee_corpus)
    if has_career_gap(resume):
        base_score *= 0.85
    if degree_from_top_20(resume):
        base_score *= 1.15
    return base_score

Intersectional analysis: The "successful employee corpus" likely overrepresents white men. Career gap penalties disproportionately affect women, particularly mothers and carers. Elite degree bonuses favour those with access to expensive education, correlating with race and class. Black women face compounded disadvantage across multiple variables.

Amrute's contribution: Encoding Race, Encoding Class (2016) demonstrates how race and class operate together in tech labour, with Indian IT workers navigating both racial stereotypes and class distinctions that shape their positioning in global software production.

Postcolonial Criticism

Postcolonial analysis examines how computational systems extend colonial logics and how code can function as site of resistance.

Linguistic imperialism in code:

Example: قلب/Alb (Arabic programming language)

  • 95% of programming languages use English keywords
  • This represents not merely convenience but epistemic violence
  • Non-English speakers must learn English to participate in programming
  • Economic consequences: Jobs requiring English-based languages exclude billions

Code as abrogation: Alb uses Arabic keywords and right-to-left syntax, disrupting the assumed universality of English-based computation. This is not merely translation but intervention into the colonial structure of programming itself.

Data colonialism: Contemporary platform capitalism extracts data from global populations for processing in imperial centres. Code implements this extraction:

function collectUserData(user) {
    return {
        location: user.gps,
        contacts: user.addressBook,
        messages: user.communications,
        behaviour: user.activityLog,
        preferences: inferPreferences(user)
    };
}

Critical analysis: Data flows from periphery to centre, value extracted and accumulated by platforms headquartered in former colonial powers. The code implements what Couldry and Mejias term "data colonialism."

Infrastructure and Platform Studies

Example: Transborder Immigrant Tool

  • Examine choice of platform (Motorola i455: inexpensive, robust)
  • Analyse use of GPS and cellular infrastructure
  • Consider material constraints shaping design
  • Trace metaphors (water witching vs technical terminology)

Multiple Audiences for Code

Code exists for and circulates among:

  1. The computer (executes operations)
  2. The programmer (writes and modifies)
  3. Other programmers (collaborate, maintain, extend)
  4. Managers (oversee development)
  5. Users (may access text in open source)
  6. Judges and lawyers (use in legal proceedings)
  7. Politicians and pundits (deploy in political debate)
  8. Poets and artists (incorporate into creative work)
  9. Humanities scholars (analyse culturally)

Implication: Each audience reads with different assumptions, priorities, literacies. Code's meaning shifts across these readerships.


LLM-Assisted Critical Code Studies

Three Modes of Cognitive Augmentation (Berry)

1. Cognitive Delegation (danger mode):

  • LLM responsiveness creates false confidence in unworkable approaches
  • System generates elaborate solutions without signalling flaws
  • Developer persists with misconceived strategy
  • Augmentation becomes substitution

Warning signs: LLM never refuses prompts, generates plausible-looking code that fails, shifts judgement burden to human

2. Productive Augmentation (optimal mode):

  • Human maintains strategic control over research/architectural decisions
  • LLM augments implementation, rapidly generates code
  • Clear division: human architectural thinking, machine code generation
  • Augmentation remains supplementary not substitutive

Optimal division: Human (strategic decisions, problem design, critical evaluation) + LLM (code structure, implementation details, rapid iteration)

3. Cognitive Overhead (hidden cost mode):

  • Managing LLM context consumes cognitive resources
  • Constant version tracking, context management required
  • LLM confusion requires repeated clarification
  • Scaling limits: beyond certain complexity, overhead exceeds benefits

The Competence Effect (Berry)

Extension of Weizenbaum's ELIZA effect:

ELIZA effect (original): Users projecting understanding onto simple pattern-matching

Competence effect (contemporary): LLM's functional capability masks absence of semantic understanding

Why more dangerous:

  • Code works, bugs get fixed, system appears to learn
  • Constant reinforcement that LLM "gets it"
  • Evidence appears to support anthropomorphism
  • Boundary between pattern-completion and understanding difficult to maintain

Triadic Hermeneutics (Berry)

Traditional hermeneutics: Dialogue between interpreter and text

Triadic structure with AI: Three-way exchange between human intention, machine generation, executable code

Hermeneutic spiral: Each iteration produces not just refined understanding but new code becoming object of interpretation

Implications:

  • Code simultaneously authored and discovered, intentional and emergent
  • Whose intentions embedded in co-generated code?
  • Need frameworks for hybrid authorship, distributed intentionality

AI Sprints Methodology

From Data Sprints to AI Sprints

Continuity:

  • Time-bounded intensive work
  • Iterative refinement through feedback loops
  • Production of intermediate mediating objects
  • Commitment to open, teachable processes

Rupture:

  • Distribution shifts from human teams to individual-LLM dialogue
  • Potential "methods crisis" as LLMs perform tasks previously requiring bespoke tools
  • Risk of losing productive friction from differently positioned knowers

Six Key Principles

  1. Architectural Control: Researcher maintains strategic authority over questions, frameworks, interpretive claims. LLM generates elements but does not determine direction.

  2. Intermediate Objects (most important): Do NOT assume LLMs can jump to comprehensive outputs. Create visualisations, summary tables, coded excerpts, thematic mappings, extracted JSON files. Make LLM processing legible and contestable (see the sketch after this list).

  3. Materialised Abstractions: Intermediate objects emerge through grammatisation process that discretises and reorders materials. Remain critically aware of what processes of abstraction produced them, what becomes visible/invisible.

  4. Hermeneutic-Computational Loop: Tight feedback between human interpretation and AI processing. Attend to intermediate objects, identify patterns/anomalies, revise prompts introducing different approaches.

  5. Methodological Transparency: Document prompts, LLM interactions, decision points, failures. Enable reproducibility, critique, collective methodological learning.

  6. Reflexive Critique: Guard against competence effect. Direct critical attention to seductive effect of speed/scale potentially coming at cost of nuance, rigour, context.
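
A hypothetical sketch of principles 2 and 5: each LLM coding pass writes an intermediate object (a JSON file of coded excerpts) together with the prompt and a timestamp, so the processing remains legible, contestable, and documentable. The llm callable is an assumption standing in for whatever model interface is used:

import json
import datetime

def run_coding_pass(llm, prompt, excerpts, out_path="coded_excerpts.json"):
    results = [{"excerpt": e, "llm_code": llm(prompt=prompt, text=e)} for e in excerpts]
    record = {
        "timestamp": datetime.datetime.now().isoformat(),
        "prompt": prompt,        # methodological transparency: the prompt travels with the output
        "results": results,      # intermediate object for human inspection and revision
    }
    with open(out_path, "w") as f:
        json.dump(record, f, indent=2)
    return record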


Contemporary Methodological Challenges

Machine Learning Systems

New interpretive challenge: Behaviour emerges from training data and learned parameters rather than explicit programming alone.

  • Traditional code: Explicit logic determining outcomes
  • ML systems: Emergent behaviours from learned patterns in latent spaces

Required approaches:

  • Explainability techniques (attention visualisation, saliency maps, feature importance)
  • Training data analysis (what patterns learned, what biases embedded)
  • Behavioural testing (probing system responses across input variations)
  • Architectural analysis (how structure shapes possible computations)

Key insight: Explainability becomes crucial for critical analysis of modern AI systems. Cannot read code alone; must examine training process, learned representations, emergent behaviours.
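
A minimal behavioural-testing sketch of the probing approach listed above, following audit-study designs: inputs differ only in a name, and systematic differences in the model's output indicate learned associations. The score_sentiment stub is an assumption, not a real API:

def score_sentiment(text):
    # stub standing in for a real model; replace with the system under audit
    return 0.0

def probe(model, template, names):
    # run the same template across names and collect the model's outputs
    return {name: model(template.format(name=name)) for name in names}

print(probe(score_sentiment,
            "{name} is applying for the position.",
            ["Emily", "Lakisha", "Mohammed", "Wei"]))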

Distributed and Opaque Systems

Challenges:

  • Microservice architectures: Functionality distributed across multiple services
  • Cloud computing: Operations span multiple physical/virtual locations
  • Proprietary platforms: Source code unavailable for direct analysis
  • Real-time compilation: Code transformed dynamically during execution

Methodological responses:

  • API analysis: Studying interfaces between components
  • Behavioural reverse engineering: Inferring logic from inputs/outputs (see the sketch after this list)
  • Infrastructure mapping: Documenting distributed system topology
  • Collaboration with insiders: Working with those who have access
  • Legal/regulatory pressure: Advocating for algorithmic transparency
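
A hypothetical sketch of behavioural reverse engineering: vary one input at a time against an opaque system and record how the observed output changes. The fetch callable stands in for whatever access is available (an API call, a headless browser, exported platform data):

def probe_variations(fetch, baseline, variations):
    # record the baseline observation, then change one parameter at a time
    observations = {"baseline": fetch(**baseline)}
    for key, value in variations:
        variant = dict(baseline, **{key: value})
        observations[f"{key}={value}"] = fetch(**variant)
    return observations   # differences between observations hint at the hidden logic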

Political Economy and Power

Urgent need: Methods for studying how code implements regimes of extraction, surveillance, control.

Example (tracking code):

class UserBehaviour {
    trackAction(action) {
        this.store.push({
            user: this.id,
            action: action,
            context: this.getCurrentContext(),
            timestamp: Date.now()
        });
        this.updateProfile();
        this.triggerRecommendations();
    }
}

Analysis needed: How does continuous tracking operate technically? How does profile updating enable targeting? How do recommendations drive engagement serving platform profit?

Comparative analysis (resistance):

async function deriveKeys(secret) {
    // Web Crypto API: derive an AES-GCM key from a secret via PBKDF2
    const salt = crypto.getRandomValues(new Uint8Array(32));
    const baseKey = await crypto.subtle.importKey(
        'raw', new TextEncoder().encode(secret), 'PBKDF2', false, ['deriveKey']);
    return await crypto.subtle.deriveKey(
        {name: 'PBKDF2', salt: salt, iterations: 100000, hash: 'SHA-256'},
        baseKey,
        {name: 'AES-GCM', length: 256},
        true,
        ['encrypt', 'decrypt']
    );
}

Signal's encryption demonstrates how code can resist surveillance. CCS needs methods for identifying and amplifying such emancipatory alternatives.


Integration with Critical Theory

Frankfurt School

  • Adorno on reification and commodity fetishism applied to computational processes
  • Benjamin on mechanical reproduction extended to digital
  • Horkheimer on instrumental rationality in algorithmic governance
  • Dialectic of enlightenment: Reason's emancipatory promises become new domination

Marx

  • Machinery analysis applied to computational infrastructure
  • Commodity form and fetishism in code commodities
  • Alienation in computational labour
  • Industrial 'mysteries' parallel to code's technical obscurity

Feminist Theory and Technoscience Studies

  • Haraway: Cyborg figuration dissolving boundaries between human/machine, nature/culture, male/female; situated knowledges against view from nowhere
  • Harding: Standpoint epistemology applied to whose knowledge shapes code, whose experiences are centred or marginalised
  • Wajcman: Technology as masculine culture, examining how gender shapes technical design and who counts as technical
  • Abbate: Historical recovery of women's contributions to computing, challenging narratives that naturalise male dominance
  • Barad: Agential realism and intra-action, examining how code and users mutually constitute each other

Methodological implications:

  • Attend to whose standpoint shapes code design and whose is excluded
  • Examine how gender operates in programming cultures and labour markets
  • Recover hidden histories of women and marginalised groups in computing
  • Question apparently neutral technical decisions for embedded gender assumptions

Critical Race Theory and Algorithmic Justice

  • Noble: Algorithms of oppression, how search engines encode racist epistemologies
  • Benjamin: Race after technology, examining how innovation reproduces inequality
  • Browne: Dark matters, surveillance and blackness, extending Foucault through critical race lens
  • Eubanks: Automating inequality, how high-tech tools target and punish poor communities
  • Chun: Discriminating data, correlation and neighbourhood as racial proxies

Core concepts for CCS:

  • Algorithmic redlining: How code implements spatial discrimination without explicit racial categories
  • Proxy discrimination: Variables that correlate with race enabling discrimination through ostensibly neutral code
  • Feedback loops: How predictive systems amplify existing inequalities through recursive application
  • Default whiteness: How technical standards assume white users and bodies as norm
  • Techno-orientalism: Racialisation of Asian technical workers and its effects on code cultures (Amrute 2016)

Methodological implications:

  • Audit algorithms for disparate racial impact
  • Trace genealogies of training data to historical discrimination
  • Examine who builds systems and whose interests they serve
  • Attend to how technical abstraction enables discrimination whilst providing deniability

Stiegler

  • Grammatisation of knowledge through code
  • Exosomatisation and technical evolution
  • Proletarianisation of cognitive labour
  • Automatic society and computational governance

Foucault

  • Power/knowledge configurations in code
  • Disciplinary mechanisms in software systems
  • Discourse analysis applied to programming communities

Postcolonial Theory

  • Schneider on global Englishes and linguistic colonisation
  • Ashcroft, Griffiths, Tiffin on abrogation and resistance
  • Economic stakes of linguistic imperialism in technology
  • Data colonialism: Extraction of data from global populations for processing in imperial centres

Deconstruction (Derrida)

  • Identifying fissures between signifier and signified
  • Questioning underlying assumptions of conventions
  • Re-inscribing gaps as fundamental to understanding

Cautionary Notes

The Quicksort Cautionary Tale

Error: Analysing Quicksort in isolation from its historical, material, and social context, drawing an analogy to neighbourhood hierarchies

Problem: Though the analysis said something about neighbourhoods, it offered little insight into Quicksort itself, nor did it draw from Quicksort any lesson about the society from which it came

Correct approach: For an algorithm like Quicksort, code meets the social realm at the site of implementation and in the context of application

Lesson: Interpretation requires reading the object in its (post)human context through a particular critical lens, involving humans and machines operating in actor-networks

Warnings and Limitations

Avoid Screen Essentialism: Do not reduce code analysis to interface or output. Must examine code's operational logic.

Avoid Pure Formalism: Code not pure mathematics or logic. Always embedded in social, economic, political contexts.

Avoid Technological Determinism: Code does not autonomously determine social outcomes. Analyse within broader configurations of power.

Recognise Epistemic Limits: Code knowledge requires technical expertise. Cannot fully analyse without some programming literacy.

Maintain Critical Distance: Neither technophilia nor technophobia. Critical engagement requires nuanced understanding.

Guard Against Competence Effect: When using LLMs: Remember pattern-completion not comprehension, test outputs against behaviour, question plausible-sounding analyses, don't anthropomorphise capabilities.

Require Collaboration: CCS calls for artful combination of knowledge of programming languages and knowledge of interpretive approaches. These analytic projects require programmers to help open up contents and workings of programs, acting as theorists along with other scholars.


Core Methodological Contributions (Four Principles)

  1. Programming belongs in scholarship: Modifications, variations, elaborations, ports illuminate original and platforms. Writing programs is not dry technical exercise but exploration of aesthetic, material, formal qualities.

  2. Fundamental relationship between formal workings and cultural implications: Cannot separate code's technical operation from cultural reception. To understand redlining fully, must consider specific code of bank's mortgage approval system.

  3. Code is ultimately understandable: The way code works is not a divine mystery. Any line from any program can be thoroughly explicated with adequate time and effort. This confidence is grounded in code's design and material reality.

  4. Code is cultural resource: Not trivial and only instrumental, but bound up in social change, aesthetic projects, relationship of people to computers. Should be valued as text with machine and human meanings, something produced and operating within culture.


Conclusion: Code as Contemporary Challenge

Critical Code Studies positions code analysis as urgent intellectual and political task.

Challenges:

  • To humanities: Develop digital intellect not just digital intelligence
  • To social sciences: Understand computational infrastructures shaping society
  • To critical theory: Extend critique to algorithmic governance and computational capitalism
  • To practice: Develop resistant alternatives and critical interventions
  • For AI age: Navigate cognitive augmentation whilst maintaining critical distance
  • For methods: Respond to AI-driven methods crisis through critical augmentation not abandonment
  • For history: Recover and interpret historical code artefacts before they are lost

Core insight: Code is neither purely technical nor purely social. It is material and symbolic, virtual and actual, constraining and enabling. Critical Code Studies provides method for rigorous analysis attending to this multiplicity whilst remaining grounded in concrete technical, social, and political realities.

Three foundational approaches remain essential:

  • Berry's materialist-phenomenological approach: Examining code's multiplicity whilst tracing political economy
  • Marino's hermeneutic-rhetorical methods: Interpreting extrafunctional significance through close reading
  • Centrifugal close reading (10 PRINT): Spiralling outward from minimal code through variorum approach

The challenge lies in developing methods that leverage AI capabilities without succumbing to instrumental rationality, that extend analytical reach without losing interpretive depth, and that democratise computational access without abandoning collective interpretive work.


Bibliography

Abbate, J. (2012) Recoding Gender: Women's Changing Participation in Computing. Cambridge, MA: MIT Press.

Amoore, L. et al. (2023) 'Machine learning, meaning making: On reading computer science texts', Big Data & Society, 10(1). Available at: https://doi.org/10.1177/20539517231166887.

Amrute, S. (2016) Encoding Race, Encoding Class: Indian IT Workers in Berlin. Durham: Duke University Press.

Bender, E.M., Gebru, T., McMillan-Major, A., and Shmitchell, S. (2021) 'On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?', Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, pp. 610–623.

Benjamin, R. (2019) Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge: Polity Press.

Berry, D. M. (2011) The Philosophy of Software: Code and Mediation in the Digital Age. Basingstoke: Palgrave Macmillan.

Berry, D. M. (2014) Critical Theory and the Digital. London: Bloomsbury Publishing.

Berry, D. M. (2023) 'Critical Digital Humanities'. In J. O'Sullivan (ed.), The Bloomsbury Handbook to the Digital Humanities, pp. 125–135. London: Bloomsbury.

Berry, D. M. (2024) 'Reflections on Method for Critical Code Studies', Stunlaw. Available at: https://stunlaw.blogspot.com/2024/12/reflections-on-method-for-critical-code.html.

Berry, D. M. (2025) 'Synthetic media and computational capitalism: towards a critical theory of artificial intelligence', AI & SOCIETY. https://doi.org/10.1007/s00146-025-02265-2.

Berry, D. M. (2025b) 'Co-Writing with an LLM: Critical Code Studies and Building an Oxford TSA App', Stunlaw. http://stunlaw.blogspot.com/2025/10/co-writing-with-llm-critical-code.html.

Berry, D. M. (2025c) 'AI Sprints: A Method for Critical Augmentation in Digital Research', Stunlaw. http://stunlaw.blogspot.com/2025/11/ai-sprints.html.

Berry, D. M. and Marino, M. C. (2024) 'Reading ELIZA: Critical Code Studies in Action', Electronic Book Review. Available at: https://electronicbookreview.com/essay/reading-eliza-critical-code-studies-in-action/.

Browne, S. (2015) Dark Matters: On the Surveillance of Blackness. Durham: Duke University Press.

Chun, W.H.K. (2011) Programmed Visions: Software and Memory. Cambridge, MA: MIT Press.

Chun, W.H.K. (2021) Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition. Cambridge, MA: MIT Press.

Dillon, S. (2020) 'The Eliza effect and its dangers: from demystification to gender critique', Journal for Cultural Research, 24(1), pp. 1–15. Available at: https://doi.org/10.1080/14797585.2020.1754642.

Eubanks, V. (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin's Press.

Haraway, D. (1991) 'A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century', in Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge, pp. 149–181.

Hayles, N.K. (2005) My Mother Was a Computer: Digital Subjects and Literary Texts. Chicago: University of Chicago Press.

Hicks, M. (2018) Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. Cambridge, MA: MIT Press.

Marino, M. C. (2006) 'Critical Code Studies', Electronic Book Review. Available at: https://electronicbookreview.com/essay/critical-code-studies/.

Marino, M. C. (2020) Critical Code Studies. Cambridge, MA: MIT Press.

Marino, M. C. and Douglass, J. (2023) 'Introduction: Situating Critical Code Studies in the Digital Humanities', Digital Humanities Quarterly, 17(2).

Montfort, N. et al. (2013) 10 PRINT CHR$(205.5+RND(1)); : GOTO 10. Cambridge, MA: MIT Press.

Noble, S.U. (2018) Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.

Scott, J.W. (1986) 'Gender: A Useful Category of Historical Analysis', The American Historical Review, 91(5), pp. 1053–1075. Available at: https://doi.org/10.2307/1864376.

Weizenbaum, J. (1966) 'ELIZA: A Computer Program For the Study of Natural Language Communication Between Man And Machine', Communications of the ACM, 9(1), pp. 36–45.
