March 2026

The developers everyone wants to hire are the hardest to find

High-skill, low-visibility engineers — why the best candidates are invisible to traditional sourcing, and how contribution-pattern analysis finds them.

The conventional wisdom in technical recruiting is to look for engineers who have a large Twitter following, a blog with thousands of subscribers, and a GitHub profile full of starred projects. This logic is understandable. Visible engineers are easy to find, and easy-to-find is good for recruiters under quota pressure.

The problem is that visibility and engineering skill are not the same thing, and in many cases they actively trade off against each other.

Why the most capable engineers are often the least visible

Engineering skill compounds through repetition and depth. The engineers who are best at what they do tend to be doing it constantly — shipping code, reviewing pull requests, debugging edge cases at 11pm, writing tests nobody asked for. That leaves little time for building a personal brand.

The developers I consistently see misidentified as "mid-level" by resume screens and recruiter filters are often the ones running core infrastructure for projects that millions of people depend on, with 12 followers on GitHub and a Twitter account they signed up for in 2015 and never posted on.

There's a name for this in open source circles: the "silent contributor" problem. The curl maintainer Daniel Stenberg has been writing and maintaining one of the most-used pieces of software in the world for 25+ years. His GitHub profile is not especially impressive-looking if you're skimming for stars and followers. You have to actually look at what he's built and maintained to understand what you're seeing.

Most recruiting workflows never get that far.

What "low-visibility" actually looks like in contribution data

The developers who slip through traditional sourcing tend to share a few characteristics when you look at their actual activity.

Their commits are dense and purposeful. Where a visibility-optimizing developer might push frequently to look active, the low-visibility engineer tends to push less often, but each commit does more. Their messages explain reasoning, not just action. "Fix null pointer dereference when auth token expires" versus "auth fix." The latter could mean anything; the former tells you the person understands the failure mode and fixed the underlying cause.
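The distinction between those two commit messages can even be approximated mechanically. The sketch below is a toy heuristic, not anything a real pipeline should rely on; every keyword and threshold in it is an assumption made up for this illustration.

```python
# Toy heuristic for commit-message specificity -- illustrating the
# "explains reasoning vs. just action" distinction, not a production signal.
# The keyword list and thresholds are assumptions for this example only.

CAUSE_WORDS = {"when", "because", "caused", "due", "after", "if"}

def message_specificity(message: str) -> float:
    """Score a commit message from 0.0 (opaque) to 1.0 (names the failure mode)."""
    words = message.lower().split()
    score = 0.0
    if len(words) >= 5:                       # more than a bare label
        score += 0.4
    if any(w in CAUSE_WORDS for w in words):  # names a trigger or cause
        score += 0.4
    if len(words) >= 8:                       # room for actual reasoning
        score += 0.2
    return score

# The two examples from the text:
print(message_specificity("Fix null pointer dereference when auth token expires"))  # high
print(message_specificity("auth fix"))                                              # low
```

A real system would use far richer features, but even this crude scorer separates the two messages above.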

Their repository count is often low and focused. A developer with three repos they've maintained consistently for five years is frequently more capable than one with 50 repos where nothing has been touched in two years. Breadth is easy to fake; sustained maintenance takes actual commitment.

They leave traces in other people's projects. Some of the best engineers I've seen sourced had almost nothing in their own repos — their work was almost entirely in contributions to other projects. Issues they filed with precise reproduction steps. PRs that fixed something that had been broken for months. Review comments that caught edge cases the original author had missed. That kind of contribution doesn't show up in a profile star count, but it shows up in the commit graph if you look.

Their forks are different from their originals. A fork that has diverged significantly from its upstream, especially if those changes are organized and maintained over time, is a strong signal. It means the developer understood the codebase well enough to improve it in specific ways, and cared enough to maintain those changes.
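GitHub's compare endpoint (`GET /repos/{owner}/{repo}/compare/{base}...{head}`) reports how many commits a fork is ahead of and behind its upstream, which is enough to rough out the signal described above. The classifier below is a hedged sketch: the cutoffs, and the use of `behind_by < ahead_by` as a proxy for "kept current," are illustrative assumptions, not calibrated values.

```python
# Sketch: interpreting the ahead_by / behind_by counts from GitHub's
# compare endpoint as a fork-divergence signal. Thresholds are assumptions.

def fork_divergence_signal(ahead_by: int, behind_by: int) -> str:
    """Classify a fork from its compare-payload ahead/behind counts."""
    if ahead_by == 0:
        return "mirror"            # nothing original: a bookmark, not a signal
    if ahead_by >= 10 and behind_by < ahead_by:
        return "maintained fork"   # substantial original work, kept current
    if ahead_by >= 10:
        return "abandoned fork"    # diverged once, then left to drift
    return "small patch"           # a fix or two, still meaningful

# Hypothetical compare responses:
print(fork_divergence_signal(ahead_by=42, behind_by=7))   # maintained fork
print(fork_divergence_signal(ahead_by=0, behind_by=300))  # mirror
```

The "maintained fork" bucket is the one worth a recruiter's attention; the other three are noise.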

Why traditional sourcing misses them entirely

Boolean search on LinkedIn does not find these people, because their profiles are often sparse or outdated. "Senior Software Engineer at [company they've been at for six years]" with skills listed as "Python, JavaScript, Linux" tells you almost nothing about what they can actually do.

The LinkedIn profile also tends to be the least-maintained artifact in a quiet developer's life. They update it when they switch jobs, which for a long-tenured engineer might be every five or six years. The skills section reflects whatever they were working on when they last thought about it. "Machine learning" might mean they read a paper about it in 2019. The reverse is also true: a developer who has spent the last three years doing deep embedded work in C might not bother listing it because it feels too obvious to mention.

Keyword scraping of GitHub doesn't help much either, because the most impactful repositories often use precise technical language rather than buzzword-friendly terminology. A project described as "a memory-safe implementation of the LZ77 compression algorithm with SIMD optimizations for x86-64" won't surface in a search for "performance engineering" or "systems software."

Even GitHub's own star and follower metrics mislead here. Stars are a measure of discoverability and marketing, not quality. A well-promoted blog post announcing a new project will generate more stars in a week than a genuinely better tool maintained quietly for years. Developers who are good at self-promotion are often not the same developers who are good at software.

How contribution-pattern analysis surfaces them

The useful signals are behavioral, not reputational. They live in the event stream: the actual record of what someone did, when, in which projects, and how.

A developer who has opened 40 issues across 15 different projects over three years, with issue descriptions that include minimal reproduction cases and proposed fixes, is demonstrating something real. It shows they understand codebases they didn't write, and that they communicate clearly enough to make a bug useful to the people who have to fix it.
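That cross-project issue footprint can be read straight out of a developer's public event stream. The sketch below assumes event dicts shaped like GitHub's public Events API (`type`, `repo.name`, `payload.action`); the sample data is invented for illustration.

```python
# Sketch: aggregating a public event stream into the cross-project issue
# signal described above. Event shape follows GitHub's Events API;
# the sample events are made up for this example.

def issue_footprint(events: list[dict]) -> tuple[int, int]:
    """Return (issues opened, distinct repos) from a list of event dicts."""
    opened = [
        e["repo"]["name"]
        for e in events
        if e["type"] == "IssuesEvent" and e["payload"].get("action") == "opened"
    ]
    return len(opened), len(set(opened))

events = [
    {"type": "IssuesEvent", "repo": {"name": "tokio-rs/tokio"}, "payload": {"action": "opened"}},
    {"type": "IssuesEvent", "repo": {"name": "serde-rs/serde"}, "payload": {"action": "opened"}},
    {"type": "PushEvent",   "repo": {"name": "someone/dotfiles"}, "payload": {}},
]
print(issue_footprint(events))  # (2, 2)
```

Judging whether those issues include minimal reproduction cases still takes reading them, but the counting part is mechanical.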

A developer who has reviewed 200+ pull requests in a project they don't maintain (a key seniority signal), reading other people's code carefully and leaving comments that catch logic errors rather than just style problems, is demonstrating collaboration skill that almost never shows up anywhere else in a profile.

This is the kind of analysis tools like riem.ai are built for: ingesting the raw event data across millions of repositories and surfacing contribution patterns that don't require a developer to have marketed themselves. A developer with 14 followers who has been the second-most-active contributor to a Rust async runtime for two years shows up in that data clearly. The signal is the work itself, not the work's reception.

The other place to look is dependencies. If your backend uses a specific ORM, serialization library, or observability SDK, the people who have contributed non-trivial code to that project understand your stack at a depth that few candidates hired through normal channels do. Chances are they've read the source code, filed a bug or two against it, maybe fixed one. Sourcing from the contributor lists of your own dependencies is one of the most underused tactics in technical recruiting.
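The tactic above is simple to bootstrap: list your dependencies, then pull each project's contributor list via GitHub's `GET /repos/{owner}/{repo}/contributors` endpoint. The sketch below shows only the first step for a Python stack (parsing a requirements file); the mapping from package name to repository is elided, and the sample file contents are invented.

```python
import re

# Sketch of the dependency-sourcing tactic: extract package names from a
# requirements.txt, then (in a real pipeline) fetch each project's
# contributor list from GitHub. Only the parsing step is shown here.

def dependency_names(requirements_text: str) -> list[str]:
    """Extract bare package names from requirements.txt-style lines."""
    names = []
    for line in requirements_text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line or line.startswith("-"):   # skip blanks and pip flags
            continue
        match = re.match(r"[A-Za-z0-9._-]+", line)
        if match:
            names.append(match.group(0))
    return names

sample = """\
sqlalchemy>=2.0   # the ORM from the example above
msgpack==1.0.7
-r extra.txt
opentelemetry-sdk
"""
print(dependency_names(sample))  # ['sqlalchemy', 'msgpack', 'opentelemetry-sdk']
```

From there, the contributor lists of those three projects are a sourcing pool pre-filtered for familiarity with your exact stack.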

What to do with this when you're sourcing

Once you find a low-visibility developer worth contacting, the outreach has to be different from what you'd send to someone with a polished personal brand.

A developer who has never tried to be found does not respond well to "I came across your profile and think you'd be a great fit for our team." They know you didn't come across anything; you ran a search. That opener reads as noise and gets ignored.

What works is specificity about the actual work. "I saw your contribution to the tokio runtime's scheduler (specifically the change you made to reduce lock contention under high concurrency). We're working on a similar problem and would love to talk about how you approached it." That requires you to actually have read what they did. But it's the only message that cuts through.

The response-rate difference between generic and specific outreach is documented in enough recruiter retrospectives now to count as established practice. The number I see cited most often is somewhere between three and five times higher for personalized messages that reference actual code. That's not surprising once you think about it from the developer's side: if someone is going to interrupt your day, they should at least show they've done the basic work of understanding what you do.

What "underrated" really means

The framing of "underrated developer" sometimes implies the developer is being held back unfairly, that there's some injustice to correct. That's occasionally true. More often, the developer is doing exactly what they want to be doing: writing software, not marketing themselves.

The problem isn't that they're underrated. The problem is that the sourcing infrastructure most companies use was built around discoverability (profiles, followers, stars) rather than around the work itself. That infrastructure systematically fails to find the developers who chose depth over visibility.

Companies that source from contribution data rather than profile data access a different pool than everyone else is fishing in. That pool is smaller and harder to work, but the hit rate on hires who stay and produce is noticeably better. Engineers who were building quietly before you found them tend to keep doing that after you hire them.

The ones who were loudest about finding their next opportunity are often still looking.

Find the engineers who've already built it

Search 30M+ monthly GitHub events. Match on real code, not resumes.

Get started