I got tired of checking job boards every morning. Open MyJob.mu, scroll, open LinkedIn, scroll, open CareerHub, scroll. Copy titles into a spreadsheet. Compare with my CV mentally. Repeat tomorrow.
So I built a thing that does all of that for me. Every hour. While I sleep, eat, or work on other things.
What It Does
Job Radar scrapes three Mauritius job boards hourly (MyJob.mu, LinkedIn, CareerHub.mu), scores each listing against my CV using AI, and pings me on WhatsApp when something scores 7 or higher.
At 5pm every day, I get an email digest of everything scored 5 and above.
That's it. No fancy UI. No web dashboard. Just a script, a database, and notifications.
The Stack
- Bun for the runtime (fast, native TypeScript, built-in SQLite)
- SQLite for storage (one file, zero config, good enough for thousands of jobs)
- Groq API running Llama 3.3 70B for AI scoring (fast inference, free tier)
- CallMeBot for WhatsApp notifications (free, works with a GET request)
- Gmail SMTP for email digests and fallback notifications
No frameworks. No Docker. No cloud infrastructure. Just a Bun script running on my machine.
How the Scrapers Work
Each job board needs a different approach.
MyJob.mu is old-school HTML. I scrape the search results page, parse the DOM with node-html-parser, and fetch each job's detail page for the full description and contact email. There's anti-scraping cruft, like zero-width spaces hidden in titles, that I strip out.
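A minimal sketch of that sanitization step. The exact set of codepoints is my assumption; the post only mentions zero-width spaces, so I cover the usual suspects:

```typescript
// Strip zero-width characters hidden in scraped titles.
// Codepoint list is an assumption: zero-width space (U+200B),
// zero-width non-joiner/joiner (U+200C/U+200D), and the BOM (U+FEFF).
function stripZeroWidth(text: string): string {
  return text.replace(/[\u200B\u200C\u200D\uFEFF]/g, "").trim();
}
```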
LinkedIn has a guest API that most people don't know about. No login needed, no API key, no OAuth. Just HTTP requests that return HTML fragments you parse with a DOM parser. I go into more detail on this below.
CareerHub.mu runs WordPress, so it has a clean REST API. Pure JSON, no HTML parsing needed. I use the _fields param to reduce each response from 31KB to about 3KB.
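As a sketch of what that query looks like. The route path (wp/v2/job-listings) and the field names are my assumptions based on typical WordPress job-board setups, not confirmed from the actual site; _fields is the standard WP REST parameter doing the 31KB-to-3KB trimming:

```typescript
// Build the CareerHub.mu WordPress REST query.
// Route and field names are assumptions; _fields is the standard
// WP REST global parameter that strips unwanted fields server-side.
function careerHubUrl(page = 1): string {
  const params = new URLSearchParams({
    per_page: "20",
    page: String(page),
    _fields: "id,title,link,date,content", // trims each response to ~3KB
  });
  return `https://careerhub.mu/wp-json/wp/v2/job-listings?${params}`;
}
```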
All three scrapers output the same shape: title, company, location, salary, URL, description, posted date, contact email. Everything normalized to ISO dates. Descriptions sanitized and truncated to 2000 chars.
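The shared shape, written out as a type. The field names come from the list above; the interface name and the truncation helper are mine:

```typescript
// The normalized record every scraper emits (fields from the post;
// the interface name is my choice).
interface JobListing {
  title: string;
  company: string;
  location: string;
  salary: string | null;
  url: string;
  description: string;        // sanitized, max 2000 chars
  postedDate: string;         // ISO 8601, e.g. "2025-01-15"
  contactEmail: string | null;
}

// Truncation step described above.
function truncateDescription(text: string, max = 2000): string {
  return text.length > max ? text.slice(0, max) : text;
}
```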
The Scoring Pipeline
Every scraped job goes through two filters.
Filter 1: Keywords. A fast pre-filter checks if the job title or description contains any of my target keywords. Jobs that don't match skip AI entirely and get auto-scored as 1. This saves API calls.
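The pre-filter is a plain substring check; a sketch, with an illustrative keyword list (the real list isn't in the post):

```typescript
// Keyword pre-filter: jobs matching nothing here skip the AI entirely.
// The keyword list is illustrative, not the author's actual list.
const KEYWORDS = ["project manager", "scrum", "engineering manager", "head of it"];

function passesKeywordFilter(title: string, description: string): boolean {
  const haystack = `${title} ${description}`.toLowerCase();
  return KEYWORDS.some((kw) => haystack.includes(kw));
}
```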
Filter 2: AI scoring. Remaining jobs get sent to Groq (running Llama 3.3 70B) with my full CV attached. The AI scores each job 1-10 based on:
- Skills and tech overlap (30%)
- Seniority match (25%)
- Industry fit (20%)
- Location (15%)
- Salary alignment (10%)
There are also hard caps. Hands-on coding roles (50%+ code) get capped at 4 because I'm looking for management. Wrong tech stack (.NET, Java, SAP) gets a -2 penalty. Score 7+ requires management/leadership, stack overlap, 10+ years seniority, Mauritius location, and compatible salary.
The AI returns JSON with the score, a reason, key matches, and key gaps. Temperature 0 for consistency. I tested this against 25 hand-picked listings (English and French) and hit 80%+ accuracy within one point of my own scores.
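Roughly what the request looks like. Groq exposes an OpenAI-compatible chat completions API and llama-3.3-70b-versatile is its Llama 3.3 70B model id, but the prompt text below is a stand-in, not the actual scoring prompt:

```typescript
// Shape of the JSON the model is asked to return.
interface ScoreResult {
  score: number;        // 1-10
  reason: string;
  keyMatches: string[];
  keyGaps: string[];
}

// Build the Groq chat-completions request body. The system prompt
// here is a placeholder; the real one encodes the weights and hard caps.
function buildScoringRequest(cv: string, job: string) {
  return {
    model: "llama-3.3-70b-versatile",
    temperature: 0, // deterministic scoring, as noted above
    response_format: { type: "json_object" as const },
    messages: [
      {
        role: "system" as const,
        content:
          "Score this job 1-10 against the CV using the weighted criteria " +
          "and hard caps. Reply with JSON: {score, reason, keyMatches, keyGaps}.",
      },
      { role: "user" as const, content: `CV:\n${cv}\n\nJob:\n${job}` },
    ],
  };
}
```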
Notifications
Score 7 or higher: instant WhatsApp message via CallMeBot. If WhatsApp fails, it falls back to email immediately. Never silently drops a notification.
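The CallMeBot call really is just a GET with three query parameters (phone, apikey, text). A sketch of the URL construction, with placeholder credentials:

```typescript
const PHONE = "+23050000000"; // placeholder, not a real number
const API_KEY = "123456";     // placeholder CallMeBot key

// CallMeBot's WhatsApp endpoint: a plain GET with the message as a
// query parameter. On failure, the caller falls back to email.
function callMeBotUrl(message: string): string {
  const params = new URLSearchParams({
    phone: PHONE,
    apikey: API_KEY,
    text: message,
  });
  return `https://api.callmebot.com/whatsapp.php?${params}`;
}
```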
Score 5 or higher: included in the daily digest email at 5pm. HTML table with all the day's matches, sorted by score.
Cross-source dedup prevents double-buzzing. If the same job title + company was already notified in the last 48 hours (from a different board), it skips the duplicate.
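The dedup logic boils down to a normalized title + company key checked against recent notifications. In the real system this lives in SQLite; here an in-memory map stands in:

```typescript
// Normalized dedup key: same title + company from different boards
// collapses to one entry.
function dedupKey(title: string, company: string): string {
  return `${title.trim().toLowerCase()}|${company.trim().toLowerCase()}`;
}

const WINDOW_MS = 48 * 60 * 60 * 1000; // the 48-hour window from above
const notified = new Map<string, number>(); // key -> last-notified timestamp

// Returns true (and skips) if this job was already notified recently;
// otherwise records it and returns false.
function alreadyNotified(title: string, company: string, now = Date.now()): boolean {
  const key = dedupKey(title, company);
  const last = notified.get(key);
  if (last !== undefined && now - last < WINDOW_MS) return true;
  notified.set(key, now);
  return false;
}
```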
First run of the day sends a heartbeat message with yesterday's stats: how many jobs scanned, how many matched.
The Schedule
Runs every hour from 8am to 6pm, Monday through Friday. Mauritius time.
First run of the day uses a wider scrape window (24 hours) to catch overnight posts. Subsequent runs use a shorter window (1-2 hours) to catch new listings quickly.
A file lock prevents overlapping runs. If a run takes longer than 10 minutes, it force-stops.
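A sketch of that lock, assuming a lock file in the temp directory (the real path and exact reclaim behavior are my guesses). A lock older than the 10-minute threshold is treated as stale and reclaimed:

```typescript
import { existsSync, openSync, closeSync, statSync, unlinkSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const LOCK_PATH = join(tmpdir(), "job-radar.lock"); // path is an assumption
const MAX_AGE_MS = 10 * 60 * 1000; // the 10-minute force-stop threshold

// Returns true if we acquired the lock, false if a run is in progress.
function acquireLock(path = LOCK_PATH): boolean {
  if (existsSync(path)) {
    const age = Date.now() - statSync(path).mtimeMs;
    if (age < MAX_AGE_MS) return false; // another run holds the lock
    unlinkSync(path); // stale lock from a killed run: reclaim it
  }
  closeSync(openSync(path, "wx")); // "wx" = create, fail if it exists
  return true;
}

function releaseLock(path = LOCK_PATH): void {
  if (existsSync(path)) unlinkSync(path);
}
```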
What I Learned
Job boards are messy. Dates come in every format imaginable. Companies post the same role on three platforms with slightly different titles. Descriptions mix English and French mid-sentence. Zero-width spaces hide in titles. You need aggressive sanitization.
AI scoring works better than you'd think. With a well-structured prompt, clear scoring criteria, and hard caps for deal-breakers, the AI's scores closely match my own judgment. The key is being very specific about what matters and what disqualifies.
Speed matters in job hunting. Some roles on MyJob.mu get 50+ applications in the first day. Getting a WhatsApp ping within an hour of posting, with the score and a direct link, means I can apply before the pile grows.
The two-step filter saves money. Keyword pre-filtering catches obviously irrelevant jobs (driver, accountant, chef) without spending an API call. About 60% of scraped jobs get filtered this way.
What's Next
The backlog includes auto-drafting cover letters for high scores, adding more job sources (government portal, recruitment agencies, company career pages), salary intelligence tracking, and eventually auto-applying on MyJob.mu for scores 9 and above.
But honestly, the current version does everything I need. I check my phone, see the matches, click the links, and apply. The boring part is automated. The human judgment part (do I actually want this job?) stays with me.
LinkedIn's Hidden Guest Jobs API
Most people don't know this exists. LinkedIn has a public, unauthenticated API for job search. No login, no API key, no bot-detection wall beyond basic rate limiting. It's the same endpoint their public job search pages use internally.
Search endpoint:
GET https://www.linkedin.com/jobs-guest/jobs/api/seeMoreJobPostings/search
The parameters you need:
- keywords: free-text search (e.g. project manager)
- location: location text (e.g. Mauritius)
- geoId: LinkedIn's numeric geo ID, more reliable than text (106931611 for Mauritius)
- sortBy: DD for date descending, R for relevance
- f_TPR: time filter using r + seconds. r3600 = last hour, r86400 = last 24h, r604800 = last week
- f_JT: job type. F = Full-time, P = Part-time, C = Contract
- f_E: experience level. 1 = Internship, 2 = Entry, 3 = Associate, 4 = Mid-Senior, 5 = Director
- f_WT: work type. 1 = On-site, 2 = Remote, 3 = Hybrid
- f_EA: set to true for Easy Apply only
- start: pagination offset, page size is 10 (0, 10, 20, ...)
Multi-value filters use commas: f_JT=F,C or f_E=4,5.
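Putting the parameters above together into a search URL (the option names on my helper are mine; the query parameter names and values are LinkedIn's):

```typescript
// Build a guest job-search URL from the parameters documented above.
function linkedInSearchUrl(opts: {
  keywords: string;
  geoId: string;       // 106931611 = Mauritius
  secondsBack: number; // f_TPR window, e.g. 3600 = last hour
  start?: number;      // pagination offset, steps of 10
}): string {
  const params = new URLSearchParams({
    keywords: opts.keywords,
    geoId: opts.geoId,
    sortBy: "DD", // newest first
    f_TPR: `r${opts.secondsBack}`,
    start: String(opts.start ?? 0),
  });
  return `https://www.linkedin.com/jobs-guest/jobs/api/seeMoreJobPostings/search?${params}`;
}
```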
To get the full job details:
GET https://www.linkedin.com/jobs-guest/jobs/api/jobPosting/{jobId}
Both endpoints return HTML fragments, not JSON, so you need a DOM parser like node-html-parser to extract the data. Rate limiting kicks in at high volume with a 429 response. A 60-second backoff handles it.
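A sketch of the fetch-with-backoff around the detail endpoint. The fetch and sleep functions are injected so the retry logic can be exercised without hitting LinkedIn; the single-retry behavior is my simplification:

```typescript
type Fetcher = (url: string) => Promise<{ status: number; text(): Promise<string> }>;

// Fetch a job posting's HTML fragment, backing off once on a 429.
async function fetchJobPosting(
  jobId: string,
  doFetch: Fetcher,
  backoffMs = 60_000, // the 60-second backoff from above
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<string> {
  const url = `https://www.linkedin.com/jobs-guest/jobs/api/jobPosting/${jobId}`;
  let res = await doFetch(url);
  if (res.status === 429) {
    await sleep(backoffMs); // rate limited: wait, then retry once
    res = await doFetch(url);
  }
  if (res.status !== 200) throw new Error(`LinkedIn returned ${res.status}`);
  return res.text(); // HTML fragment; parse with node-html-parser
}
```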