feat: v0.2.0 — Joplin import, ChatGPT Projects, --project filter

Core features:
- Add `joplin` command: syncs exported Markdown to Joplin via local REST API
- Notebooks auto-created per provider+project (e.g. "ChatGPT - My Project")
- Idempotent: notes updated (not duplicated) on re-run; note ID tracked in manifest
- Add `--project` filter to `export` and `list` commands (substring or 'none')
- Add ChatGPT Projects support via CHATGPT_PROJECT_IDS env var
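For context, the create-or-update behavior in the first bullet can be sketched against Joplin's data API (which listens on port 41184 by default and takes the token as a query parameter). This is a minimal sketch, not the repo's JoplinClient: `upsert_note` and the injected `http` parameter are illustrative names.

```python
import json
import urllib.parse
import urllib.request


def _http(method: str, url: str, payload: dict) -> dict:
    """Send a JSON request to the Joplin data API and decode the reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        method=method,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())


def upsert_note(base_url, token, notebook_id, title, body, note_id=None, http=_http):
    """Create a note on first sync; update the same note on later runs."""
    qs = urllib.parse.urlencode({"token": token})
    payload = {"title": title, "body": body, "parent_id": notebook_id}
    if note_id:  # already in the manifest -> PUT updates in place, no duplicate
        http("PUT", f"{base_url}/notes/{note_id}?{qs}", payload)
        return note_id
    # first sync -> POST creates the note; caller stores the id as joplin_note_id
    return http("POST", f"{base_url}/notes?{qs}", payload)["id"]
```

The `http` parameter is injected only so the sketch can be exercised without a running Joplin instance; the real client presumably wraps this in a class with configured base URL and token.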

Config:
- Add JOPLIN_API_TOKEN, JOPLIN_API_URL, JOPLIN_REQUEST_TIMEOUT
- Version now read from importlib.metadata (single source of truth: pyproject.toml)
- Bump version to 0.2.0

Quality:
- Explicit timeout handling in JoplinClient with actionable error messages
- Token validation (validate_token) separated from connectivity check (ping)
- Remove debug_auth.py, debug_claude.py, and untracked .har file
- Add *.har to .gitignore (may contain auth cookies/session tokens)
- Update README, CHANGELOG, FUTURE.md to reflect v0.2.0
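A sketch of what "actionable error messages" for the connectivity check can look like. `JoplinConnectionError` and the message wording are assumptions, though Joplin's /ping endpoint really does answer "JoplinClipperServer" when the Web Clipper service is enabled:

```python
import socket
import urllib.error
import urllib.request


class JoplinConnectionError(Exception):
    """Raised with a hint about the likely fix instead of a bare traceback."""


def ping(base_url: str, timeout: float = 5.0) -> str:
    """Check connectivity; Joplin's /ping answers 'JoplinClipperServer'."""
    try:
        with urllib.request.urlopen(f"{base_url}/ping", timeout=timeout) as resp:
            return resp.read().decode()
    except (TimeoutError, socket.timeout) as exc:
        raise JoplinConnectionError(
            f"Joplin did not answer within {timeout}s. Is the Web Clipper "
            "service enabled in Joplin's options?"
        ) from exc
    except urllib.error.URLError as exc:
        raise JoplinConnectionError(
            f"Could not reach {base_url}. Check JOPLIN_API_URL and that "
            "Joplin is running."
        ) from exc
```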

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Author: JesseMarkowitz
Date:   2026-03-01 06:04:03 -05:00
parent 23d7c17255
commit 304cf4fde4
16 changed files with 1795 additions and 133 deletions


@@ -1,4 +1,4 @@
-"""Local cache manifest for tracking exported conversations."""
+"""Local cache manifest for tracking exported and Joplin-synced conversations."""
 import json
 import logging
@@ -18,11 +18,17 @@ class CacheError(Exception):
 class Cache:
-    """Manages the local JSON manifest of exported conversations.
+    """Manages the local JSON manifest of exported and Joplin-synced conversations.
 
-    The manifest is the single source of truth for what has been exported.
-    Every run compares the provider's full conversation list against this
-    manifest to determine what is new or updated.
+    The manifest is the single source of truth for what has been exported and
+    synced. Every export run compares the provider's full conversation list
+    against this manifest to determine what is new or updated. The Joplin sync
+    run reads it to find conversations not yet pushed to Joplin (or re-exported
+    since the last sync).
+
+    Each entry tracks:
+        title, project, updated_at, exported_at, file_path,
+        joplin_note_id (after first sync), joplin_synced_at (after first sync)
 
     File security:
     - Permissions: 600 (owner read/write only)
@@ -150,6 +156,59 @@ class Cache:
         """Return all cached entries for a provider (for --cache --show)."""
         return dict(self._data.get(provider, {}))
 
+    def mark_joplin_synced(self, provider: str, conv_id: str, note_id: str) -> None:
+        """Record a successful Joplin sync for a conversation.
+
+        Adds ``joplin_note_id`` and ``joplin_synced_at`` to the manifest entry
+        and writes atomically to disk.
+        """
+        entry = self._data.get(provider, {}).get(conv_id)
+        if entry is None:
+            logger.warning(
+                "[cache] mark_joplin_synced: no cache entry for %s/%s", provider, conv_id[:8]
+            )
+            return
+        entry["joplin_note_id"] = note_id
+        entry["joplin_synced_at"] = datetime.now(tz=timezone.utc).isoformat()
+        self._save()
+
+    def get_joplin_pending(self, provider: str) -> list[tuple[str, dict]]:
+        """Return (conv_id, entry) pairs that need to be synced to Joplin.
+
+        A conversation is pending when:
+        - It has never been synced (no ``joplin_note_id``), OR
+        - It was re-exported after the last Joplin sync
+          (``exported_at`` > ``joplin_synced_at``).
+
+        Returns:
+            List of (conv_id, entry_dict) tuples, where entry_dict includes
+            ``file_path``, ``title``, ``project``, and optionally ``joplin_note_id``.
+        """
+        pending = []
+        for conv_id, entry in self._data.get(provider, {}).items():
+            if not isinstance(entry, dict):
+                continue
+            if not entry.get("file_path"):
+                continue
+            note_id = entry.get("joplin_note_id")
+            if not note_id:
+                pending.append((conv_id, entry))
+                continue
+            # Re-sync if the file was re-exported after the last Joplin sync
+            exported_at = entry.get("exported_at", "")
+            synced_at = entry.get("joplin_synced_at", "")
+            if exported_at and synced_at:
+                try:
+                    from src.utils import _parse_dt
+                    if _parse_dt(exported_at) > _parse_dt(synced_at):
+                        pending.append((conv_id, entry))
+                except Exception:
+                    pass
+        return pending
+
     def last_run(self) -> str | None:
         """Return the ISO8601 timestamp of the last export run, or None."""
         return self._data.get("last_run")
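To illustrate get_joplin_pending's per-entry rule, here is the lifecycle of one manifest entry. Timestamps and the file path are made up, and needs_joplin_sync is a stand-in mirror of the predicate above, not repo code:

```python
from datetime import datetime

entry = {
    "title": "My chat",                              # illustrative values
    "project": "My Project",
    "updated_at": "2026-02-27T10:00:00+00:00",
    "exported_at": "2026-02-28T09:00:00+00:00",
    "file_path": "exports/chatgpt/my-chat.md",
}


def needs_joplin_sync(entry: dict) -> bool:
    """Per-entry mirror of get_joplin_pending's decision."""
    if not entry.get("file_path"):
        return False                 # never exported to disk, nothing to push
    if not entry.get("joplin_note_id"):
        return True                  # exported but never synced
    # re-exported after the last sync?
    return (datetime.fromisoformat(entry["exported_at"])
            > datetime.fromisoformat(entry["joplin_synced_at"]))


assert needs_joplin_sync(entry)      # no joplin_note_id yet
entry["joplin_note_id"] = "abc123"
entry["joplin_synced_at"] = "2026-02-28T10:00:00+00:00"
assert not needs_joplin_sync(entry)  # synced after the export
entry["exported_at"] = "2026-03-01T00:00:00+00:00"
assert needs_joplin_sync(entry)      # re-exported since the sync
```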