Research in the Time of AI
- Author/Source: Paul Goldsmith-Pinkham (Substack / paulgp.com, 2026-03-16)
- Original: https://paulgp.substack.com/p/research-in-the-time-of-ai
Key Ideas
- Research should be understood as a pipeline of stages: ideation, design/feasibility, data assembly, core analysis, robustness, writing, posting/submission, and publication -- all iterative.
- Two main anxieties about AI in research: (1) a flood of low-quality "slop" including supercharged p-hacking, and (2) loss of academic jobs and career value as execution skills depreciate.
- LLMs compress the research timeline but do not bypass the process -- each stage still requires human judgment and iteration.
- The skills AI replicates are execution skills (coding, data cleaning, drafting); the skills it does not replicate -- research taste, identification judgment, institutional knowledge -- have become more valuable.
- AI-assisted replication will likely offset p-hacking concerns, since revisiting old work becomes much cheaper.
- AI is a flashlight in the dark -- the hard part is still knowing where to walk.
Summary
Paul Goldsmith-Pinkham presents an ontology of the academic research pipeline in economics, from ideation through publication, and systematically examines where AI changes each stage. He identifies two main sources of anxiety: the fear of a flood of low-quality papers (especially via "one-shot" production from idea to posted paper) and the fear that AI-driven depreciation of execution skills threatens academic careers. On the first point, he acknowledges that AI can churn through many stages rapidly but argues that posing the right question remains the most important and most human part of research. On p-hacking, he notes that AI tools cut both ways -- they make it easier to p-hack, but they also dramatically lower the cost of replication and scrutiny.
On the career anxiety, Goldsmith-Pinkham is empathetic but direct. He acknowledges the real depreciation of human capital invested in execution skills (programming, econometrics implementation, data work) but argues that this reveals what was truly valuable all along: knowing which questions matter, having taste for credible research design, and understanding institutional settings deeply enough to know when the data are misleading. He frames the situation as a cost reduction that "hones the question of what is good research."
The article concludes with a stage-by-stage description of how AI helps at each point in the research pipeline, from ideation (prompting toward ideas, like reading articles) through publication (easier replication packages, better visualizations). The overarching message is optimistic: AI expands the research frontier, compresses timelines, and democratizes access to sophisticated methods, but the hard part -- research judgment -- remains firmly human.
Relevance to Economics Research
This is one of the most systematic treatments of how AI affects the economics research production function. The pipeline ontology is immediately useful for thinking about where to deploy AI tools in one's own workflow. The distinction between execution skills (depreciating) and judgment skills (appreciating) is especially relevant for PhD students and early-career researchers making human capital investment decisions.
Related Concepts
- concepts/vibe-research
- concepts/ai-research-tools
- concepts/research-quality
- concepts/human-in-the-loop
- concepts/empirical-methods