I’m exploring ways to collect LinkedIn post URLs based on a keyword search, with the goal of later reviewing those posts and commenting on them manually (no bots auto-commenting, no mass spam).
Concretely, I’d like to:
- Search for posts that mention specific keywords (e.g. “we’re hiring”, “looking for X role”, etc.)
- Collect the URLs of those posts (and maybe some basic metadata: author, date, text snippet)
- Store them in a database or spreadsheet so I can go through them myself and decide if/what to comment
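To make the storage side concrete, here’s roughly what I’m imagining for the review queue: a small SQLite table keyed on the post URL so re-running a search never creates duplicates. All the field names and the `status` workflow are just my own sketch, not anything LinkedIn-specific:

```python
import sqlite3

# Hypothetical schema: one row per collected post, deduplicated on URL.
SCHEMA = """
CREATE TABLE IF NOT EXISTS posts (
    url     TEXT PRIMARY KEY,   -- post URL (dedupe key)
    author  TEXT,
    posted  TEXT,               -- ISO date string
    snippet TEXT,               -- short excerpt of the post text
    status  TEXT DEFAULT 'new'  -- new / reviewed / commented / skipped
)
"""

def save_posts(conn, posts):
    """Insert posts, silently skipping URLs already stored."""
    with conn:
        conn.executemany(
            "INSERT OR IGNORE INTO posts (url, author, posted, snippet) "
            "VALUES (:url, :author, :posted, :snippet)",
            posts,
        )

conn = sqlite3.connect(":memory:")  # use a file path for a real run
conn.execute(SCHEMA)
save_posts(conn, [
    {"url": "https://www.linkedin.com/posts/example-123",
     "author": "Jane Doe", "posted": "2024-05-01",
     "snippet": "We're hiring a backend engineer..."},
])
```

The `status` column is what keeps this human-in-the-loop: the tool only ever appends `new` rows, and I’d flip statuses by hand as I review.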
I’m aware LinkedIn’s ToS is quite strict about scraping and automation, and I don’t want to get an account banned or build something outright abusive. I’m trying to understand what a realistic, non-shady approach would look like here.
So my questions are:
- Is there any legitimate / API-based way to do this at small scale?
- If not, what’s the “least bad” / most ethical pattern people have seen in practice?
  - Browser automation with very low volume and human-in-the-loop?
  - Third-party tools that already provide some kind of post search / export?
- Are there better alternatives I should consider (e.g. relying on other data sources, job boards, etc.) instead of touching LinkedIn directly?
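On that last point, one alternative I’ve been considering: many ATSs (Greenhouse and Lever, for example) expose public JSON job feeds, which would sidestep LinkedIn entirely for the “we’re hiring” use case. A minimal sketch of keyword-matching such a feed, where the feed shape and keywords below are simplified/illustrative rather than any specific ATS’s real schema:

```python
# Hypothetical keywords I'd search for; case-insensitive substring match.
KEYWORDS = ("backend engineer", "data engineer")

def matching_jobs(feed, keywords=KEYWORDS):
    """Return jobs whose title contains any keyword (case-insensitive)."""
    hits = []
    for job in feed.get("jobs", []):
        title = job.get("title", "").lower()
        if any(k in title for k in keywords):
            hits.append({"title": job["title"], "url": job.get("url")})
    return hits

# Illustrative feed shape, not a real API response.
sample = {"jobs": [
    {"title": "Senior Backend Engineer", "url": "https://example.com/jobs/1"},
    {"title": "Office Manager", "url": "https://example.com/jobs/2"},
]}
hits = matching_jobs(sample)
```

In a real run I’d fetch the feed over HTTP and funnel the hits into the same review spreadsheet/database, so the manual-review step stays identical regardless of source.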
I’m less interested in “here is how to bypass LinkedIn’s protections” and more in: “Given the constraints, what’s the most robust, boring, semi-automated setup you’d design?”
Curious how people who’ve actually built hiring / sourcing / lead-gen tools think about this.
Thanks!