🎯 What it does

CareerSignal is built around one idea: turn professional visibility into career opportunities. For an AI/Data candidate, that means consistently producing quality content, building a targeted network, and tracking every opportunity, all with minimal manual effort.

The platform links four loops together: create professional signal → turn that signal into better conversations → track opportunities and follow-ups → feed everything with structured technical watch.

At a glance:
  • 5 core product engines
  • 6+ FastAPI endpoints
  • 2 LLM providers (Claude + OpenAI)
  • CI/CD: GitHub Actions + Docker

โš™๏ธ Core Engines

โœ๏ธ Content Engine
  • Generate Medium articles and LinkedIn posts from a topic, URL, PDF, or GitHub repo
  • QA loop with critique-refine before publication
  • Multi-provider LLM support (Claude, OpenAI)
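The critique-refine QA loop can be sketched as follows; `generate`, `critique`, and `refine` are hypothetical stand-ins for the engine's LLM calls, not names from the codebase.

```python
# Minimal sketch of a critique-refine QA loop. generate(), critique(),
# and refine() are illustrative callables standing in for real LLM calls.

def critique_refine(topic, generate, critique, refine, max_rounds=3):
    """Draft content, then alternate critique and refinement until the
    critique comes back empty or the round budget is exhausted."""
    draft = generate(topic)
    for _ in range(max_rounds):
        issues = critique(draft)
        if not issues:          # critique found nothing to fix: publish-ready
            break
        draft = refine(draft, issues)
    return draft

# Toy stand-ins: the critique flags the draft until it mentions an example.
gen = lambda t: f"Post about {t}."
crit = lambda d: [] if "example" in d else ["add a code example"]
ref = lambda d, issues: d + " Includes an example."

print(critique_refine("RAG pipelines", gen, crit, ref))
# Post about RAG pipelines. Includes an example.
```

Bounding the loop with `max_rounds` keeps LLM cost predictable even when the critique never fully converges.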
📅 Publishing Engine
  • LinkedIn content calendar with editable date, time, content, and hashtags
  • Approve / reject / re-open workflow
  • Time-aware autopublish worker
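A time-aware autopublish pass might look like this minimal sketch; the post fields (`publish_at`, `status`) and the status values are assumptions, not the repo's actual schema.

```python
# Sketch of one time-aware autopublish pass: every approved post whose
# scheduled time has passed gets published. Field names are illustrative.
from datetime import datetime, timezone

def autopublish_due(posts, now=None):
    """Publish approved posts whose ISO "publish_at" timestamp is due."""
    now = now or datetime.now(timezone.utc)
    published = []
    for post in posts:
        due = datetime.fromisoformat(post["publish_at"])
        if post["status"] == "approved" and due <= now:
            post["status"] = "published"   # a real worker would call the LinkedIn API here
            published.append(post["id"])
    return published

queue = [
    {"id": 1, "publish_at": "2025-01-01T09:00:00+00:00", "status": "approved"},
    {"id": 2, "publish_at": "2099-01-01T09:00:00+00:00", "status": "approved"},
    {"id": 3, "publish_at": "2025-01-01T10:00:00+00:00", "status": "rejected"},
]
print(autopublish_due(queue, now=datetime(2025, 6, 1, tzinfo=timezone.utc)))  # [1]
```

Injecting `now` as a parameter makes the worker's time logic testable without touching the system clock.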
๐Ÿค Networking Engine
  • Search LinkedIn profiles and recruiter targets
  • Profile enrichment via scraping
  • Generate personalized connection notes
  • Outreach status tracking with rate limits
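Outreach rate limiting could be sketched like this; the class name and the per-day cap are illustrative assumptions, not documented LinkedIn limits.

```python
# Sketch of daily outreach rate limiting: refuse new connection notes
# once a per-calendar-day cap is reached. The cap value is illustrative.
from collections import Counter
from datetime import date

class OutreachLimiter:
    def __init__(self, daily_cap=20):
        self.daily_cap = daily_cap
        self.sent = Counter()          # day -> number of notes sent that day

    def try_send(self, profile_url, today=None):
        """Record one outreach attempt; refuse once the daily cap is hit."""
        today = today or date.today()
        if self.sent[today] >= self.daily_cap:
            return False               # over quota: caller should queue for tomorrow
        self.sent[today] += 1
        return True

limiter = OutreachLimiter(daily_cap=2)
d = date(2025, 6, 1)
print([limiter.try_send("https://linkedin.com/in/example", today=d) for _ in range(3)])
# [True, True, False]
```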
💼 Jobs Engine
  • Search and deduplicate LinkedIn job postings
  • SQLite-backed persistence for every opportunity
  • Status and notes tracking per saved job
  • Dedicated Streamlit UI page
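Deduplication plus SQLite persistence can be illustrated with a minimal sketch; the table shape and the (company, title) uniqueness key are assumptions, not the project's real schema.

```python
# Sketch of job dedup backed by SQLite: a UNIQUE constraint makes
# re-saving the same posting a no-op. Schema is illustrative only.
import sqlite3

def save_job(conn, company, title, url):
    """Insert a job once; returns True only when a new row was created."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS jobs (
               company TEXT, title TEXT, url TEXT,
               status TEXT DEFAULT 'saved', notes TEXT DEFAULT '',
               UNIQUE (company, title))"""
    )
    cur = conn.execute(
        "INSERT OR IGNORE INTO jobs (company, title, url) VALUES (?, ?, ?)",
        (company, title, url),
    )
    conn.commit()
    return cur.rowcount == 1   # 0 rows affected means the insert was ignored

conn = sqlite3.connect(":memory:")
print(save_job(conn, "Acme", "ML Engineer", "https://example.com/1"))  # True
print(save_job(conn, "Acme", "ML Engineer", "https://example.com/1"))  # False (duplicate)
```

Pushing dedup into the database via `INSERT OR IGNORE` avoids read-then-write races when a search returns overlapping result pages.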
📡 Watch Engine
  • Collect articles from RSS feeds and scraped sources
  • Summarize technical content with LLMs
  • Suggest LinkedIn posts from watch articles
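The RSS collection step reduces to a small parse; this sketch uses only the standard library, with a hard-coded feed string where the real engine would fetch over HTTP.

```python
# Sketch of RSS 2.0 collection using only the standard library; a real
# collector would fetch the feed with urllib/requests before parsing.
import xml.etree.ElementTree as ET

def collect_articles(rss_xml):
    """Extract (title, link) pairs from an RSS 2.0 feed string."""
    root = ET.fromstring(rss_xml)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

feed = """<rss version="2.0"><channel>
  <item><title>Mixture-of-Experts explained</title><link>https://example.com/moe</link></item>
  <item><title>SQLite WAL mode</title><link>https://example.com/wal</link></item>
</channel></rss>"""
print(collect_articles(feed))
```

Each (title, link) pair would then be handed to the LLM summarizer and, optionally, turned into a suggested LinkedIn post.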

๐Ÿ—๏ธ Architecture

The repo is in active transition from a working prototype to a clean layered architecture. The target design separates business logic from runtime adapters and external integrations.

app/            Runtime adapters: Streamlit UI, FastAPI, CLI, background workers
domain/         Pure business rules: entities, statuses, use cases
services/       External adapters: LinkedIn, Medium, RSS, LLM providers
storage/        SQLite, migrations, repositories (replacing legacy JSON)
observability/  Structured logs, workflow run IDs, execution events
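The layer boundary can be illustrated with a toy sketch: a pure domain entity, a repository interface, and an in-memory implementation standing in for the SQLite-backed one. All names here are illustrative, not the repo's code.

```python
# Sketch of the domain/storage boundary: runtime adapters depend on the
# JobRepository interface, never on a concrete database. Names are
# illustrative (JobStatus, Job, JobRepository are not the repo's code).
from dataclasses import dataclass
from enum import Enum
from typing import Protocol

class JobStatus(Enum):            # domain/: pure business vocabulary
    SAVED = "saved"
    APPLIED = "applied"

@dataclass
class Job:                        # domain/: entity with no I/O dependencies
    title: str
    status: JobStatus = JobStatus.SAVED

class JobRepository(Protocol):    # boundary the storage/ layer implements
    def add(self, job: Job) -> None: ...
    def all(self) -> list[Job]: ...

class InMemoryJobRepository:      # stand-in for the SQLite-backed repository
    def __init__(self):
        self._jobs: list[Job] = []
    def add(self, job: Job) -> None:
        self._jobs.append(job)
    def all(self) -> list[Job]:
        return list(self._jobs)

repo: JobRepository = InMemoryJobRepository()
repo.add(Job("Data Engineer"))
print([j.title for j in repo.all()])   # ['Data Engineer']
```

Because the domain layer never imports sqlite3 or Streamlit, the JSON-to-SQLite migration can proceed behind the repository interface without touching business rules.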

Every workflow carries a run_id. Every sensitive action leaves a persistent trace. SQLite is the source of truth; the remaining JSON files are transitional legacy being migrated out progressively.
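A run_id-tagged execution event might be emitted like this; the event names and the JSON-lines log format are assumptions for illustration.

```python
# Sketch of run_id propagation into structured logs: every event emitted
# during a workflow carries the same run_id, one JSON object per line.
# Event and field names are illustrative.
import json
import uuid
from datetime import datetime, timezone

def start_workflow(name):
    """Create a workflow context with a fresh run_id."""
    return {"workflow": name, "run_id": uuid.uuid4().hex}

def log_event(ctx, event, **fields):
    """Emit one structured log line tagged with the workflow's run_id."""
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "run_id": ctx["run_id"],
        "workflow": ctx["workflow"],
        "event": event,
        **fields,
    }
    print(json.dumps(record))      # a real setup would write to a log sink
    return record

ctx = start_workflow("autopublish")
rec = log_event(ctx, "post_published", post_id=42)
```

Grepping the log for one run_id then reconstructs everything a single workflow execution did, which is the audit-trail property the architecture aims for.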

📊 Implementation Status

Feature                                             Status
--------------------------------------------------  ------
LinkedIn Scheduling + autopublish worker            ✓ Done
Jobs Tracker with SQLite persistence                ✓ Done
FastAPI layer (health, db-init, jobs, autopublish)  ✓ Done
Content Engine (Medium + LinkedIn generation)       ✓ Done
Networking Engine (outreach + profile scraping)     ✓ Done
Docker + GitHub Actions CI/CD                       ✓ Done
Migrate scheduling/outreach to SQLite               → Next
Recruiter enrichment in Jobs Tracker                → Next
Workflow run persistence + audit trails             → Next
Analytics dashboards                                Later
Multi-user support + authentication                 Later